
chore: Update pylint to 2.17.4 #24700

Merged: 20 commits, Jul 25, 2023
Changes from 11 commits
1 change: 1 addition & 0 deletions .pylintrc
@@ -86,6 +86,7 @@ disable=
missing-docstring,
duplicate-code,
unspecified-encoding,
cyclic-import,
Member:
Why are we blanket disabling this message? This message is typically useful and thus I would be somewhat hesitant to ignore it globally.

Contributor Author @EugeneTorap, Jul 20, 2023:

There are hundreds of cyclic-import errors in the project.
@john-bodley I believe that your SIP-92 Proposal for restructuring the Python code base should solve the problem!
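
For readers unfamiliar with the check: pylint's cyclic-import (R0401) fires when two modules import each other at module level, directly or through a longer chain. A minimal sketch, shown as the contents of two hypothetical modules (the names are made up, not taken from the Superset code base):

# charts_helpers.py (hypothetical module)
from dashboards_helpers import DEFAULT_DASHBOARD_NAME


def chart_title(chart_id: int) -> str:
    """Build a chart title that mentions the default dashboard."""
    return f"Chart {chart_id} on {DEFAULT_DASHBOARD_NAME}"


# dashboards_helpers.py (hypothetical module)
from charts_helpers import chart_title  # closes the cycle: pylint reports cyclic-import (R0401) for this pair

DEFAULT_DASHBOARD_NAME = "Examples"


def dashboard_summary(dashboard_id: int) -> str:
    """Summarize a dashboard via its first chart's title."""
    return f"Dashboard {dashboard_id}: {chart_title(1)}"

A pair like this also tends to fail at import time (whichever module loads first sees a partially initialized sibling), which is why the check is generally worth keeping; breaking a cycle usually means moving the shared pieces into a third module or deferring one import into a function body, which is presumably the kind of restructuring SIP-92 is after.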

Member:

@EugeneTorap as far as I'm aware the cyclic-import error existed in Pylint 2.9.6, and thus I'm somewhat perplexed as to why this is significantly more problematic now. In general we shouldn't be relaxing most Pylint rules, and thus I'm somewhat hesitant to approve something which in theory could lead to a regression.

When I pulled your branch, removed line #89 from the .pylintrc file, and ran:

python3.9 -m pip install -r requirements/integration.txt
tox -e pylint

I didn't see any cyclic-import errors being reported.

Note I did see some errors with the .pylintrc file:

.pylintrc:1:0: E0015: Unrecognized option found: optimize-ast, files-output, argument-name-hint, method-name-hint, variable-name-hint, inlinevar-name-hint, const-name-hint, class-name-hint, class-attribute-name-hint, module-name-hint, attr-name-hint, function-name-hint, no-space-check (unrecognized-option)

It seems like some of these options are likely deprecated.
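
As a rough aid for that cleanup, a sketch that reports where the stale entries live (assuming it is run from the repository root; the option names are simply the ones quoted in the E0015 message above, not pulled from pylint itself):

"""Locate .pylintrc options that recent pylint no longer recognizes."""
from pathlib import Path

STALE_OPTIONS = {
    "optimize-ast", "files-output", "argument-name-hint", "method-name-hint",
    "variable-name-hint", "inlinevar-name-hint", "const-name-hint",
    "class-name-hint", "class-attribute-name-hint", "module-name-hint",
    "attr-name-hint", "function-name-hint", "no-space-check",
}

for lineno, line in enumerate(
    Path(".pylintrc").read_text(encoding="utf-8").splitlines(), start=1
):
    option = line.split("=", 1)[0].strip()  # "optimize-ast=no" -> "optimize-ast"
    if option in STALE_OPTIONS:
        print(f".pylintrc:{lineno}: stale option '{option}' can be removed")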

# re-enable once this no longer raises false positives
too-many-instance-attributes

2 changes: 1 addition & 1 deletion requirements/base.txt
@@ -305,7 +305,7 @@ werkzeug==2.3.3
# flask
# flask-jwt-extended
# flask-login
wrapt==1.12.1
wrapt==1.15.0
Member:

I'm a tad perplexed as to how/where this was bumped, since it's a non-direct requirement and, as far as I can tell, the deprecated package (of which wrapt is a direct requirement) wasn't bumped.

Contributor Author:

When I updated pylint, it bumped astroid, and astroid in turn updated wrapt to 1.15.0.

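One way to confirm that chain locally, as a sketch (assuming the pinned requirements are installed in the current environment; none of this is part of the PR itself):

"""Show the pylint -> astroid -> wrapt relationship from installed metadata."""
from importlib.metadata import requires, version

for package in ("pylint", "astroid", "wrapt"):
    print(f"{package}=={version(package)}")

# astroid declares a wrapt version constraint; pip-compile-multi resolves it to
# the newest matching release, which is presumably how wrapt ends up at 1.15.0.
for requirement in requires("astroid") or []:
    if requirement.startswith("wrapt"):
        print("astroid requires:", requirement)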

# via deprecated
wtforms==2.3.3
# via
2 changes: 1 addition & 1 deletion requirements/development.in
@@ -21,7 +21,7 @@
ipython
progress>=1.5,<2
pyinstrument>=4.0.2,<5
pylint==2.9.6
pylint==2.14.5
python-ldap>=3.4.3
setuptools>=65.5.1
sqloxide
16 changes: 11 additions & 5 deletions requirements/development.txt
@@ -1,4 +1,4 @@
# SHA1:4c0ce3a84b01a5a3fe6c72cbf2fc96e5eada2dbe
# SHA1:ddea3c4f001a81288c61a968a9ff3788e3cb9dd4
#
# This file is autogenerated by pip-compile-multi
# To update, run:
@@ -12,7 +12,7 @@
# -r requirements/development.in
appnope==0.1.3
# via ipython
astroid==2.6.6
astroid==2.11.7
# via pylint
asttokens==2.2.1
# via stack-data
@@ -34,6 +34,8 @@ charset-normalizer==3.1.0
# via requests
decorator==5.1.1
# via ipython
dill==0.3.6
# via pylint
et-xmlfile==1.1.0
# via openpyxl
executing==1.2.0
@@ -62,7 +64,7 @@ linear-tsv==1.1.0
# via tabulator
matplotlib-inline==0.1.6
# via ipython
mccabe==0.6.1
mccabe==0.7.0
# via pylint
mysqlclient==2.1.0
# via apache-superset
@@ -76,6 +78,8 @@ pickleshare==0.7.5
# via ipython
pillow==9.5.0
# via apache-superset
platformdirs==3.8.1
# via pylint
progress==1.6
# via -r requirements/development.in
psycopg2-binary==2.9.6
@@ -98,7 +102,7 @@ pyhive[hive]==0.6.5
# via apache-superset
pyinstrument==4.4.0
# via -r requirements/development.in
pylint==2.9.6
pylint==2.14.5
# via -r requirements/development.in
python-ldap==3.4.3
# via -r requirements/development.in
@@ -128,7 +132,9 @@ thrift==0.16.0
# thrift-sasl
thrift-sasl==0.4.3
# via pyhive
toml==0.10.2
tomli==2.0.1
# via pylint
tomlkit==0.11.8
# via pylint
traitlets==5.9.0
# via
2 changes: 1 addition & 1 deletion superset/charts/commands/delete.py
@@ -64,7 +64,7 @@ def validate(self) -> None:
if reports := ReportScheduleDAO.find_by_chart_ids(self._model_ids):
report_names = [report.name for report in reports]
raise ChartDeleteFailedReportsExistError(
_("There are associated alerts or reports: %s" % ",".join(report_names))
_(f"There are associated alerts or reports: {','.join(report_names)}")
)
# Check ownership
for model in self._models:
1 change: 0 additions & 1 deletion superset/charts/commands/warm_up_cache.py
@@ -62,7 +62,6 @@ def run(self) -> dict[str, Any]:
force=True,
)

# pylint: disable=assigning-non-slot
g.form_data = form_data
payload = obj.get_payload()
delattr(g, "form_data")
7 changes: 3 additions & 4 deletions superset/charts/data/api.py
@@ -143,7 +143,7 @@ def get_data(self, pk: int) -> Response:
query_context = self._create_query_context_from_form(json_body)
command = ChartDataCommand(query_context)
command.validate()
except DatasourceNotFound as error:
Member:
I know some people find Pylint to be somewhat cumbersome, but I love to see Pylint ever evolving, which results in fixes like this that flake8 et al. typically miss.

except DatasourceNotFound:
return self.response_404()
except QueryObjectValidationError as error:
return self.response_400(message=error.message)
@@ -233,7 +233,7 @@ def data(self) -> Response:
query_context = self._create_query_context_from_form(json_body)
command = ChartDataCommand(query_context)
command.validate()
except DatasourceNotFound as error:
except DatasourceNotFound:
return self.response_404()
except QueryObjectValidationError as error:
return self.response_400(message=error.message)
@@ -420,11 +420,10 @@ def _get_data_response(

return self._send_chart_response(result, form_data, datasource)

# pylint: disable=invalid-name, no-self-use
# pylint: disable=invalid-name
def _load_query_context_form_from_cache(self, cache_key: str) -> dict[str, Any]:
return QueryContextCacheLoader.load(cache_key)

# pylint: disable=no-self-use
def _create_query_context_from_form(
self, form_data: dict[str, Any]
) -> QueryContext:
3 changes: 2 additions & 1 deletion superset/charts/filters.py
@@ -160,7 +160,8 @@ def apply(self, query: Query, value: Any) -> Query:
Slice.id == FavStar.obj_id,
),
isouter=True,
).filter( # pylint: disable=comparison-with-callable
).filter(
# pylint: disable=comparison-with-callable
or_(
Slice.id.in_(owner_ids_query),
Slice.created_by_fk == get_user_id(),
4 changes: 0 additions & 4 deletions superset/cli/importexport.py
@@ -74,7 +74,6 @@ def export_dashboards(dashboard_file: Optional[str] = None) -> None:
from superset.dashboards.commands.export import ExportDashboardsCommand
from superset.models.dashboard import Dashboard

# pylint: disable=assigning-non-slot
g.user = security_manager.find_user(username="admin")

dashboard_ids = [id_ for (id_,) in db.session.query(Dashboard.id).all()]
@@ -109,7 +108,6 @@ def export_datasources(datasource_file: Optional[str] = None) -> None:
from superset.connectors.sqla.models import SqlaTable
from superset.datasets.commands.export import ExportDatasetsCommand

# pylint: disable=assigning-non-slot
g.user = security_manager.find_user(username="admin")

dataset_ids = [id_ for (id_,) in db.session.query(SqlaTable.id).all()]
@@ -151,7 +149,6 @@ def import_dashboards(path: str, username: Optional[str]) -> None:
)

if username is not None:
# pylint: disable=assigning-non-slot
g.user = security_manager.find_user(username=username)
if is_zipfile(path):
with ZipFile(path) as bundle:
@@ -317,7 +314,6 @@ def import_dashboards(path: str, recursive: bool, username: str) -> None:
elif path_object.exists() and recursive:
files.extend(path_object.rglob("*.json"))
if username is not None:
# pylint: disable=assigning-non-slot
g.user = security_manager.find_user(username=username)
contents = {}
for path_ in files:
7 changes: 1 addition & 6 deletions superset/cli/main.py
@@ -71,12 +71,7 @@ def init() -> None:
def version(verbose: bool) -> None:
"""Prints the current version number"""
print(Fore.BLUE + "-=" * 15)
print(
Fore.YELLOW
+ "Superset "
+ Fore.CYAN
+ "{version}".format(version=app.config["VERSION_STRING"])
)
print(Fore.YELLOW + "Superset " + Fore.CYAN + f"{app.config['VERSION_STRING']}")
print(Fore.BLUE + "-=" * 15)
if verbose:
print("[DB] : " + f"{db.engine}")
2 changes: 1 addition & 1 deletion superset/commands/exceptions.py
@@ -44,7 +44,7 @@ def __init__(
super().__init__(
_(
self.message_format.format(
object_type, '"%s" ' % object_id if object_id else ""
object_type, f'"{object_id}" ' if object_id else ""
Member:
Just wondering whether the trailing space should be there. I realize it was previously, but I'm not sure if that was a typo.

)
),
exception,
2 changes: 1 addition & 1 deletion superset/commands/importers/v1/examples.py
@@ -91,7 +91,7 @@ def _get_uuids(cls) -> set[str]:
)

@staticmethod
def _import( # pylint: disable=arguments-differ, too-many-locals, too-many-branches
def _import( # pylint: disable=too-many-locals, too-many-branches
session: Session,
configs: dict[str, Any],
overwrite: bool = False,
1 change: 0 additions & 1 deletion superset/common/query_context_factory.py
@@ -93,7 +93,6 @@ def create(
cache_values=cache_values,
)

# pylint: disable=no-self-use
def _convert_to_model(self, datasource: DatasourceDict) -> BaseDatasource:
return DatasourceDAO.get_datasource(
session=db.session,
2 changes: 1 addition & 1 deletion superset/common/query_object_factory.py
@@ -82,7 +82,7 @@ def _convert_to_model(self, datasource: DatasourceDict) -> BaseDatasource:
session=self._session_maker(),
)

def _process_extras( # pylint: disable=no-self-use
def _process_extras(
self,
extras: dict[str, Any] | None,
) -> dict[str, Any]:
2 changes: 2 additions & 0 deletions superset/config.py
@@ -1147,6 +1147,7 @@ def CSV_TO_HIVE_UPLOAD_DIRECTORY_FUNC( # pylint: disable=invalid-name
# TRACKING_URL_TRANSFORMER = (
# lambda url, query: url if is_fresh(query) else None
# )
# pylint: disable=unnecessary-lambda-assignment
TRACKING_URL_TRANSFORMER = lambda url: url


@@ -1466,6 +1467,7 @@ def EMAIL_HEADER_MUTATOR( # pylint: disable=invalid-name,unused-argument
# This can be used to set any properties of the object based on naming
# conventions and such. You can find examples in the tests.

# pylint: disable=unnecessary-lambda-assignment
SQLA_TABLE_MUTATOR = lambda table: table


2 changes: 1 addition & 1 deletion superset/connectors/base/models.py
@@ -589,7 +589,7 @@ def update_from_object(self, obj: dict[str, Any]) -> None:
else []
)

def get_extra_cache_keys( # pylint: disable=no-self-use
def get_extra_cache_keys(
self, query_obj: QueryObjectDict # pylint: disable=unused-argument
) -> list[Hashable]:
"""If a datasource needs to provide additional keys for calculation of
2 changes: 1 addition & 1 deletion superset/connectors/sqla/models.py
@@ -300,7 +300,7 @@ def type_generic(self) -> utils.GenericDataType | None:
return GenericDataType.TEMPORAL

return (
column_spec.generic_type # pylint: disable=used-before-assignment
column_spec.generic_type
if (
column_spec := self.db_engine_spec.get_column_spec(
self.type,
6 changes: 3 additions & 3 deletions superset/connectors/sqla/views.py
@@ -335,7 +335,7 @@ class TableModelView( # pylint: disable=too-many-ancestors
"offset": _("Timezone offset (in hours) for this datasource"),
"table_name": _("Name of the table that exists in the source database"),
"schema": _(
"Schema, as used only in some databases like Postgres, Redshift " "and DB2"
"Schema, as used only in some databases like Postgres, Redshift and DB2"
),
"description": Markup(
'Supports <a href="https://daringfireball.net/projects/markdown/">'
@@ -361,7 +361,7 @@ class TableModelView( # pylint: disable=too-many-ancestors
"from the backend on the fly"
),
"is_sqllab_view": _(
"Whether the table was generated by the 'Visualize' flow " "in SQL Lab"
"Whether the table was generated by the 'Visualize' flow in SQL Lab"
),
"template_params": _(
"A set of parameters that become available in the query using "
@@ -410,7 +410,7 @@ class TableModelView( # pylint: disable=too-many-ancestors
)
}

def post_add( # pylint: disable=arguments-differ
def post_add(
self,
item: "TableModelView",
flash_message: bool = True,
2 changes: 1 addition & 1 deletion superset/daos/base.py
@@ -33,7 +33,7 @@
from superset.extensions import db
from superset.utils.core import get_iterable

T = TypeVar("T", bound=Model) # pylint: disable=invalid-name
T = TypeVar("T", bound=Model)


class BaseDAO(Generic[T]):
13 changes: 6 additions & 7 deletions superset/dashboards/api.py
@@ -284,10 +284,9 @@ def ensure_thumbnails_enabled(self) -> Optional[Response]:

def __repr__(self) -> str:
"""Deterministic string representation of the API instance for etag_cache."""
return "Superset.dashboards.api.DashboardRestApi@v{}{}".format(
self.appbuilder.app.config["VERSION_STRING"],
self.appbuilder.app.config["VERSION_SHA"],
)
version_str = self.appbuilder.app.config["VERSION_STRING"]
version_sha = self.appbuilder.app.config["VERSION_SHA"]
return f"Superset.dashboards.api.DashboardRestApi@v{version_str}{version_sha}"

@expose("/<id_or_slug>", methods=("GET",))
@protect()
@@ -305,7 +304,7 @@ def __repr__(self) -> str:
@statsd_metrics
@with_dashboard
@event_logger.log_this_with_extra_payload
# pylint: disable=arguments-differ
# pylint: disable=arguments-differ,arguments-renamed
def get(
self,
dash: Dashboard,
Expand Down Expand Up @@ -756,8 +755,8 @@ def bulk_delete(self, **kwargs: Any) -> Response:
@event_logger.log_this_with_context(
action=lambda self, *args, **kwargs: f"{self.__class__.__name__}.export",
log_to_statsd=False,
) # pylint: disable=too-many-locals
def export(self, **kwargs: Any) -> Response:
)
def export(self, **kwargs: Any) -> Response: # pylint: disable=too-many-locals
"""Export dashboards
---
get:
2 changes: 1 addition & 1 deletion superset/dashboards/commands/delete.py
@@ -60,7 +60,7 @@ def validate(self) -> None:
if reports := ReportScheduleDAO.find_by_dashboard_ids(self._model_ids):
report_names = [report.name for report in reports]
raise DashboardDeleteFailedReportsExistError(
_("There are associated alerts or reports: %s" % ",".join(report_names))
_(f"There are associated alerts or reports: {','.join(report_names)}")
)
# Check ownership
for model in self._models:
3 changes: 1 addition & 2 deletions superset/dashboards/filter_sets/api.py
@@ -123,7 +123,6 @@ def __init__(self) -> None:
super().__init__()

def _init_properties(self) -> None:
# pylint: disable=bad-super-call
super(BaseSupersetModelRestApi, self)._init_properties()

@expose("/<int:dashboard_id>/filtersets", methods=("GET",))
@@ -181,7 +180,7 @@ def get_list(self, dashboard_id: int, **kwargs: Any) -> Response:
$ref: '#/components/responses/404'
"""
if not DashboardDAO.find_by_id(cast(int, dashboard_id)):
return self.response(404, message="dashboard '%s' not found" % dashboard_id)
return self.response(404, message=f"dashboard '{dashboard_id}' not found")
rison_data = kwargs.setdefault("rison", {})
rison_data.setdefault("filters", [])
rison_data["filters"].append(
3 changes: 1 addition & 2 deletions superset/dashboards/filter_sets/commands/delete.py
@@ -49,7 +49,6 @@ def validate(self) -> None:
except FilterSetNotFoundError as err:
if FilterSetDAO.find_by_id(self._filter_set_id): # type: ignore
raise FilterSetForbiddenError(
'the filter-set does not related to dashboard "%s"'
% str(self._dashboard_id)
f"the filter-set does not related to dashboard {self._dashboard_id}"
) from err
raise err
3 changes: 1 addition & 2 deletions superset/dashboards/filter_sets/schemas.py
@@ -46,7 +46,6 @@ def _validate_json_meta_data(self, json_meta_data: str) -> None:

class FilterSetPostSchema(FilterSetSchema):
json_metadata_schema: JsonMetadataSchema = JsonMetadataSchema()
# pylint: disable=W0613
name = fields.String(
required=True,
allow_none=False,
Expand Down Expand Up @@ -83,7 +82,7 @@ class FilterSetPutSchema(FilterSetSchema):
)

@post_load
def validate( # pylint: disable=unused-argument
def validate(
self, data: Mapping[Any, Any], *, many: Any, partial: Any
) -> dict[str, Any]:
if JSON_METADATA_FIELD in data: