6 changes: 3 additions & 3 deletions .pre-commit-config.yaml
@@ -24,7 +24,7 @@ repos:
       - id: end-of-file-fixer
       - id: trailing-whitespace
   - repo: https://github.com/tox-dev/pyproject-fmt
-    rev: v2.16.2
+    rev: v2.21.0
     hooks:
       - id: pyproject-fmt
   - repo: https://github.com/abravalheri/validate-pyproject
@@ -44,7 +44,7 @@ repos:
         entry: isort --profile=black
         name: isort (python)
   - repo: https://github.com/psf/black-pre-commit-mirror
-    rev: 26.1.0
+    rev: 26.3.1
     hooks:
       - id: black
   - repo: https://github.com/tonybaloney/perflint
@@ -60,7 +60,7 @@ repos:
         additional_dependencies:
           - black
   - repo: https://github.com/codespell-project/codespell
-    rev: v2.4.1
+    rev: v2.4.2
     hooks:
       - id: codespell
         args: [--toml pyproject.toml]
2 changes: 0 additions & 2 deletions docs/data-processing/apis/fastapi/example.rst
@@ -16,7 +16,6 @@ Create a file :file:`main.py` with:

     from fastapi import FastAPI

-
     app = FastAPI()


@@ -77,7 +76,6 @@ Now we modify the file ``main.py`` to receive a body from a ``PUT`` request:

     from fastapi import FastAPI

-
     app = FastAPI()


6 changes: 2 additions & 4 deletions docs/data-processing/postgresql/db-api.rst
@@ -82,12 +82,10 @@ Cursor
 .. code-block:: python

     cursor = conn.cursor()
-    cursor.execute(
-        """
+    cursor.execute("""
         SELECT column1, column2
         FROM tableA
-    """
-    )
+    """)
     for column1, column2 in cursor.fetchall():
         print(column1, column2)

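The reformatted call above follows the Python DB-API (PEP 249), which psycopg shares with the stdlib ``sqlite3`` driver. A minimal runnable sketch of the same pattern, using an in-memory SQLite database in place of the PostgreSQL connection (the table and column names are illustrative, taken from the snippet):

```python
import sqlite3

# In-memory database stands in for the PostgreSQL connection
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE tableA (column1 TEXT, column2 INTEGER)")
cursor.execute("INSERT INTO tableA VALUES ('a', 1), ('b', 2)")

# Same cursor pattern as in the documentation snippet above
cursor.execute("""
    SELECT column1, column2
    FROM tableA
""")
for column1, column2 in cursor.fetchall():
    print(column1, column2)

conn.close()
```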
2 changes: 0 additions & 2 deletions docs/data-processing/postgresql/sqlalchemy.rst
@@ -44,7 +44,6 @@ Database connection

     from sqlalchemy import create_engine

-
     engine = create_engine("postgresql:///example", echo=True)

 Data model
@@ -56,7 +55,6 @@ Data model
     from sqlalchemy.ext.declarative import declarative_base
     from sqlalchemy.orm import relationship

-
     Base = declarative_base()


1 change: 0 additions & 1 deletion docs/data-processing/serialisation-formats/toml/index.rst
@@ -39,7 +39,6 @@ Overview

     import toml

-
     config = toml.load("pyproject.toml")

 .. seealso::
@@ -1,7 +1,6 @@
 # SPDX-FileCopyrightText: 2022 Veit Schiele
 #
 # SPDX-License-Identifier: BSD-3-Clause
-
 [tool.black]
 line-length = 79

@@ -14,6 +13,5 @@ lines_between_types = 1
 multi_line_output = 3
 not_skip = "__init__.py"
 use_parentheses = true
-
 known_first_party = [ "MY_FIRST_MODULE", "MY_SECOND_MODULE" ]
 known_third_party = [ "mpi4py", "numpy", "requests" ]
3 changes: 0 additions & 3 deletions docs/performance/index.rst
@@ -52,7 +52,6 @@ We can create sample data with:

     from sklearn.datasets import make_blobs

-
     points, labels_true = make_blobs(
         n_samples=1000, centers=3, random_state=0, cluster_std=0.60
     )
@@ -133,7 +132,6 @@ algorithm:

     from sklearn.cluster import KMeans

-
     KMeans(10).fit_predict(points)

 * `dask_ml.cluster.KMeans
@@ -143,7 +141,6 @@ algorithm:

     from dask_ml.cluster import KMeans

-
     KMeans(10).fit(points).predict(points)

 The best that could be said against these existing solutions is that they could
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -20,7 +20,7 @@ classifiers = [
 ]
 dependencies = []
 urls."Bug Tracker" = "https://github.com/cusyio/Python4DataScience/issues"
-urls."Homepage" = "https://github.com/cusyio/Python4DataScience/"
+urls.Homepage = "https://github.com/cusyio/Python4DataScience/"

 [dependency-groups]
 dev = [