- `DistanceMatrix.from_addresses()` takes a variable number of
`Address` objects and creates distance matrix entries for them
- as a base measure, the air distance between two `Address`
objects is calculated
- in addition, an integration with the Google Maps Directions API is
implemented that provides a more realistic measure of the distance
and duration a rider on a bicycle would need to travel between two
`Address` objects
- add a `Location.lat_lng` convenience property that provides the
`.latitude` and `.longitude` of an `Address` as a 2-`tuple`
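A minimal sketch of the air-distance base measure and the `lat_lng` property; the simplified `Location` class and the `great_circle_distance()` helper are hypothetical stand-ins for the actual ORM-backed implementation:

```python
import math


class Location:
    """Simplified stand-in for the ORM-backed `Address` / `Location` classes."""

    def __init__(self, latitude: float, longitude: float) -> None:
        self.latitude = latitude
        self.longitude = longitude

    @property
    def lat_lng(self) -> tuple:
        """The `.latitude` and `.longitude` as a 2-`tuple`."""
        return (self.latitude, self.longitude)


def great_circle_distance(origin: Location, destination: Location) -> float:
    """Air distance between two locations in meters (haversine formula)."""
    earth_radius = 6_371_000  # meters
    lat1, lng1 = (math.radians(degrees) for degrees in origin.lat_lng)
    lat2, lng2 = (math.radians(degrees) for degrees in destination.lat_lng)
    a = (
        math.sin((lat2 - lat1) / 2) ** 2
        + math.cos(lat1) * math.cos(lat2) * math.sin((lng2 - lng1) / 2) ** 2
    )
    return 2 * earth_radius * math.asin(math.sqrt(a))
```

The Google Maps Directions API integration adds a second, routed measure (bicycling mode) on top of this base measure; its details are not sketched here.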
- the model applies a simple moving average on horizontal time series
- refactor `db.Forecast.from_dataframe()` to correctly convert
`float('NaN')` values into `None`; otherwise, SQLAlchemy complains
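One way to do that conversion (a sketch with a made-up DataFrame, not the project's exact code): the frame must be cast to the `object` dtype first, because a float column silently coerces `None` back into `NaN`:

```python
import pandas as pd

df = pd.DataFrame({"prediction": [1.0, float("NaN"), 3.0]})

# Cast to `object` so that `None` survives; `.where()` keeps values
# where the condition holds and substitutes `None` everywhere else.
records = df.astype(object).where(df.notna(), None).to_dict(orient="records")
# => [{'prediction': 1.0}, {'prediction': None}, {'prediction': 3.0}]
```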
- the function implements a forecasting "method" similar to the
seasonal naive method
=> instead of simply taking the last observation given a seasonal lag,
it linearly extrapolates all observations of the same seasonal lag
from the past into the future; conceptually, it is like the
seasonal naive method with built-in smoothing
- the function is tested just like the `arima.predict()` and
`ets.predict()` functions
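A minimal sketch of the idea behind that forecasting "method" (the helper name and signature are assumptions, not the project's actual API): for every position within the season, a straight line is fit through all past observations at that seasonal lag and extrapolated one step into the future:

```python
import numpy as np
import pandas as pd


def extrapolate_season(training_ts: pd.Series, frequency: int) -> list:
    """Forecast one full season ahead.

    Assumes at least two past observations per seasonal lag,
    e.g., `frequency=7` for daily data with a weekly pattern.
    """
    predictions = []
    for lag in range(frequency):
        observations = training_ts.to_numpy()[lag::frequency]
        steps = np.arange(len(observations))
        # Fit a line through all observations at the same seasonal lag ...
        slope, intercept = np.polyfit(steps, observations, deg=1)
        # ... and extrapolate it one period beyond the training horizon.
        predictions.append(intercept + slope * len(observations))
    return predictions
```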
+ rename the `tests.forecasts.methods.test_ts_methods` module
to `tests.forecasts.methods.test_predictions`
- re-organize some constants in the `tests` package
- streamline some docstrings
- move the module
- unify the corresponding tests in `tests.forecasts.methods` sub-package
- make all `predict()` functions and the `stl()` function round their results
- streamline documentation
- ensure a `Restaurant` only has one unique `Order.pickup_address`
- rework `Grid.gridify()` so that only pickup addresses are assigned
into `Pixel`s
- include database migrations to ensure the data adhere to these
tighter constraints
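A rough sketch of the reworked `Grid.gridify()` behavior (simplified, with plain objects instead of the ORM models and the city's UTM viewport):

```python
def gridify(orders, side_length):
    """Assign only pickup (i.e., restaurant) addresses into pixels.

    `orders` are objects whose `.pickup_address` has `.x` / `.y`
    coordinates in meters relative to the city's southwest corner.
    Returns a dict mapping pixel coordinates `(n_x, n_y)` to the set
    of pickup addresses falling into that pixel.
    """
    pixels = {}
    for order in orders:
        address = order.pickup_address  # delivery addresses are ignored
        key = (int(address.x // side_length), int(address.y // side_length))
        pixels.setdefault(key, set()).add(address)
    return pixels
```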
- add a Jupyter notebook that installs all project-external
dependencies regarding R and R packages
- adjust the GitHub Action workflow to also install R and the R packages
used within the project
- add an `init_r` module that initializes all R packages globally
once the `urban_meal_delivery` package is imported
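A minimal sketch of such an `init_r` module, assuming `rpy2` is the Python-R bridge and `forecast`/`stats` are among the R packages used (the exact package list is an assumption):

```python
"""Initialize all used R packages once at import time."""

from rpy2.robjects import packages

# Importing an R package via rpy2 loads it into the embedded R session;
# doing this once globally avoids repeated start-up costs in the forecasting code.
forecast = packages.importr("forecast")  # e.g., ets(), auto.arima()
stats = packages.importr("stats")  # e.g., stl()
```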
- the `aggregate_orders()` function queries the database and aggregates the ad-hoc orders
by pixel and time steps into a demand time series
- implement "heavy" integration tests for `aggregate_orders()`
- make `pandas` a package dependency
- streamline the `Config`
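A simplified sketch of the aggregation step behind `aggregate_orders()` (column names are assumptions; the real function works off a database query per grid and time-step length):

```python
import pandas as pd


def aggregate_orders(orders: pd.DataFrame, time_step: int) -> pd.DataFrame:
    """Count ad-hoc orders per pixel and time step.

    `orders` is assumed to have a boolean `ad_hoc` column, a `pixel_id`
    column, and a datetime `placed_at` column; `time_step` is in minutes.
    """
    ad_hoc = orders[orders["ad_hoc"]]  # scheduled orders are left out
    return (
        ad_hoc.groupby(
            ["pixel_id", pd.Grouper(key="placed_at", freq=f"{time_step}min")]
        )
        .size()
        .rename("n_orders")
        .reset_index()
    )
```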
- the purpose of the `Grid.gridify()` constructor method is to generate all `Pixel`s
for a `Grid` that have at least one `Address` assigned to them
- fix missing `UniqueConstraint` in `Grid` class => it was not possible
to create two `Grid`s with the same `.side_length` in different cities
- change the `City.viewport` property into two separate `City.southwest`
and `City.northeast` properties; also add `City.total_x` and
`City.total_y` properties for convenience
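A sketch of what the new `City` properties could look like, assuming the corners are `Location`-like objects exposing `.x` / `.y` coordinates in meters (attribute names for the underlying data are made up):

```python
class City:
    """Excerpt reduced to the viewport-related properties."""

    # `_southwest` / `_northeast` are assumed to be built elsewhere,
    # e.g., from the latitude/longitude columns stored in the database.

    @property
    def southwest(self):
        """The southwest corner of the city's viewport."""
        return self._southwest

    @property
    def northeast(self):
        """The northeast corner of the city's viewport."""
        return self._northeast

    @property
    def total_x(self):
        """East-west extension of the viewport in meters."""
        return self.northeast.x - self.southwest.x

    @property
    def total_y(self):
        """North-south extension of the viewport in meters."""
        return self.northeast.y - self.southwest.y
```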
- the old `UTMCoordinate` class becomes the new `Location` class
- its main purpose is to represent locations in both lat-long
coordinates as well as in the UTM system
- remove `Address.__init__()` and `City.__init__()` methods as they
are not executed for entries retrieved from the database
- simplify `Location.__init__()` => remove the `relative_to` argument
- remove the factory functions for creating engines and sessions
- define global engine, connection, and session objects to be used
everywhere in the urban_meal_delivery package
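A minimal sketch of such module-level objects with SQLAlchemy (the `config.DATABASE_URI` attribute name is an assumption based on the configuration bullets further down):

```python
import sqlalchemy as sa
from sqlalchemy import orm

from urban_meal_delivery import config

# Shared globally throughout the package instead of per-call factory functions.
engine = sa.create_engine(config.DATABASE_URI)
connection = engine.connect()
session = orm.sessionmaker(bind=connection)()
```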
- add Grid, Pixel, and AddressPixelAssociation ORM models
- each Grid belongs to a City and is characterized by the side_length
of all the square Pixels contained in it
- Pixels aggregate Addresses => many-to-many relationship (that is
modeled with SQLAlchemy's Association Pattern to implement a couple
of constraints)
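A condensed sketch of SQLAlchemy's Association Object pattern for the Address-Pixel relationship (columns and constraints are trimmed down; SQLAlchemy 1.3-style declarative base):

```python
import sqlalchemy as sa
from sqlalchemy import orm
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()


class Pixel(Base):
    __tablename__ = "pixels"
    id = sa.Column(sa.Integer, primary_key=True)
    n_x = sa.Column(sa.Integer, nullable=False)  # column index within the grid
    n_y = sa.Column(sa.Integer, nullable=False)  # row index within the grid

    addresses = orm.relationship("AddressPixelAssociation", back_populates="pixel")


class Address(Base):
    __tablename__ = "addresses"
    id = sa.Column(sa.Integer, primary_key=True)

    pixels = orm.relationship("AddressPixelAssociation", back_populates="address")


class AddressPixelAssociation(Base):
    """Association object so that extra constraints can live on the link itself."""

    __tablename__ = "addresses_pixels"

    address_id = sa.Column(sa.Integer, sa.ForeignKey("addresses.id"), primary_key=True)
    pixel_id = sa.Column(sa.Integer, sa.ForeignKey("pixels.id"), primary_key=True)

    address = orm.relationship("Address", back_populates="pixels")
    pixel = orm.relationship("Pixel", back_populates="addresses")
```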
- the Address.x and Address.y properties use the UTMCoordinate class
behind the scenes
- x and y are simple coordinates in an x-y plane
- the (0, 0) origin is the southwest corner of Address.city.viewport
- the class is a utility to abstract working with latitude-longitude
coordinates in their UTM representation (~ "cartesian plane")
- the class's .x and .y properties enable working with simple x-y
coordinates where the (0, 0) origin is the lower-left of a city's
viewport
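A rough sketch of how such x-y coordinates can be obtained with the `utm` package (assuming that library; the southwest corner of the city's viewport serves as the (0, 0) origin):

```python
import utm


class Location:
    """Simplified version of the lat-long/UTM utility described above."""

    def __init__(self, latitude: float, longitude: float) -> None:
        self.latitude = latitude
        self.longitude = longitude
        # Easting/northing are meters within the location's UTM zone.
        self.easting, self.northing, self.zone, self.band = utm.from_latlon(
            latitude, longitude,
        )
        self._origin = None

    def relate_to(self, other: "Location") -> None:
        """Make `other` (e.g., a city's southwest corner) the (0, 0) origin."""
        self._origin = other

    @property
    def x(self) -> int:
        """Meters east of the origin; requires `.relate_to()` to be called first."""
        return round(self.easting - self._origin.easting)

    @property
    def y(self) -> int:
        """Meters north of the origin; requires `.relate_to()` to be called first."""
        return round(self.northing - self._origin.northing)
```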
- create `*Factory` classes with factory_boy and faker that generate
randomized instances of the ORM models
- add new pytest marker: "db" are the integration tests involving the
database whereas "e2e" will be all other integration tests
- streamline the docstrings in the ORM models
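A small sketch of such a factory with factory_boy and faker, together with a test wearing the new "db" marker (field and fixture names are assumptions):

```python
import factory
import pytest
from factory.alchemy import SQLAlchemyModelFactory

from urban_meal_delivery import db


class AddressFactory(SQLAlchemyModelFactory):
    """Create randomized Address instances for the test suite."""

    class Meta:
        model = db.Address
        sqlalchemy_session = None  # injected by the test suite's session fixture

    street = factory.Faker("street_address")
    latitude = factory.Faker("latitude")
    longitude = factory.Faker("longitude")


@pytest.mark.db
def test_insert_address(db_session):
    """Integration test that actually touches the database."""
    address = AddressFactory.build()
    db_session.add(address)
    db_session.commit()
```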
- many *.py and *.ipynb files will contain links to resources on
GitHub or nbviewer that have branch references in them
- add a pre-commit hook implemented as the nox session
"fix-branch-references" that goes through these files and
changes all the branch labels to the current one
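A condensed sketch of what that session does, written here as a plain function (the real implementation is a nox session; path handling is simplified):

```python
import pathlib
import re
import subprocess


def fix_branch_references(paths=("src", "notebooks")):
    """Point GitHub/nbviewer links at the currently checked-out branch."""
    branch = subprocess.run(
        ["git", "rev-parse", "--abbrev-ref", "HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    for path in paths:
        for file in pathlib.Path(path).glob("**/*"):
            if file.suffix not in (".py", ".ipynb"):
                continue
            old_text = file.read_text()
            # e.g., ".../blob/<old-branch>/..." -> ".../blob/<current-branch>/..."
            new_text = re.sub(r"(/(?:blob|tree)/)[\w.-]+(/)", rf"\g<1>{branch}\g<2>", old_text)
            if new_text != old_text:
                file.write_text(new_text)
```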
- use Alembic to migrate the PostgreSQL database
+ create initial migration script to set up the database,
as an alternative to db.Base.metadata.create_all()
+ integrate Alembic into the test suite; the db_engine fixture
now has two modes:
* create the latest version of tables all at once
* invoke `alembic upgrade head`
=> the "e2e" tests are all run twice, once in each mode; this
ensures that the migration scripts re-create the same database
schema as db.Base.metadata.create_all() would
* in both modes, a temporary PostgreSQL schema is used to create the
tables in
=> could now run "e2e" tests against production database and still
have isolation
- make the configuration module public (to be used by Alembic)
- adjust linting rules for Alembic
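A condensed sketch of the two-mode `db_engine` fixture described above (names such as `config.DATABASE_URI` are assumptions; the temporary PostgreSQL schema handling is omitted):

```python
import pytest
import sqlalchemy as sa
from alembic import command as alembic_command
from alembic.config import Config as AlembicConfig

from urban_meal_delivery import config, db


@pytest.fixture(params=["all_at_once", "migrations"])
def db_engine(request):
    """Create the schema either via the ORM or via the migration scripts."""
    engine = sa.create_engine(config.DATABASE_URI)

    if request.param == "all_at_once":
        db.Base.metadata.create_all(engine)  # latest version of all tables
    else:
        alembic_command.upgrade(AlembicConfig("alembic.ini"), "head")

    try:
        yield engine
    finally:
        db.Base.metadata.drop_all(engine)
```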
- use SQLAlchemy (and PostgreSQL) to model the ORM layer
- add the following models:
+ Address => modelling all kinds of addresses
+ City => model the three target cities
+ Courier => model the UDP's couriers
+ Customer => model the UDP's customers
+ Order => model the orders received by the UDP
+ Restaurant => model the restaurants active on the UDP
- so far, the emphasis lies on expressing the Foreign Key
and Check Constraints that are used to validate the assumptions
inherent to the cleaned data
- provide database-independent unit tests with 100% coverage
- provide additional integration tests ("e2e") that commit data to
a PostgreSQL instance to validate that the constraints work
- adapt linting rules a bit
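As an illustration of such constraints, a trimmed-down model in that style (column names and bounds are examples, not the project's exact definitions):

```python
import sqlalchemy as sa
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()


class Address(Base):
    """Foreign Key and Check Constraints encode assumptions about the cleaned data."""

    __tablename__ = "addresses"

    id = sa.Column(sa.Integer, primary_key=True)
    city_id = sa.Column(sa.Integer, nullable=False)
    latitude = sa.Column(sa.Float, nullable=False)
    longitude = sa.Column(sa.Float, nullable=False)

    __table_args__ = (
        sa.ForeignKeyConstraint(["city_id"], ["cities.id"]),
        sa.CheckConstraint("-90 <= latitude AND latitude <= 90", name="latitude_is_valid"),
        sa.CheckConstraint("-180 <= longitude AND longitude <= 180", name="longitude_is_valid"),
    )
```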
- add the following file:
+ src/urban_meal_delivery/_config.py
- a config module is created holding two sets of configurations:
+ production => against the real database
+ testing => against a database with test data
- the module is "protected" (i.e., underscore) and imported at the
top level via a proxy-like object `config` that detects in which of
the two environments the package is being run
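A minimal sketch of that proxy idea (the environment variable and attribute names are assumptions):

```python
"""Configurations for the production and testing environments."""

import os


class ProductionConfig:
    TESTING = False
    DATABASE_URI = os.getenv("DATABASE_URL")


class TestingConfig(ProductionConfig):
    TESTING = True
    DATABASE_URI = os.getenv("DATABASE_URL_TESTING")


def make_config():
    """Return the configuration matching the current environment."""
    env = os.getenv("URBAN_MEAL_DELIVERY_ENV", "production")  # hypothetical variable
    return TestingConfig() if env == "testing" else ProductionConfig()


# re-exported at the package's top level as `config`
config = make_config()
```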
- use sphinx to document the developed package
- create a nox session "docs" that builds the docs
- include a skeleton in the docs/ folder
+ how to install the package
+ how to use nox
+ license
- for tests/ and the noxfile.py, type annotations are not strictly
enforced any more
+ this simplifies the way test cases and nox sessions are written
+ for many pytest fixtures, no types are available via a public API
- put fixtures inside the classes the corresponding test cases are
grouped in
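A short example of the fixture-inside-the-class pattern (made-up test content):

```python
import pytest


class TestCoordinates:
    """Group related test cases and keep their fixture right next to them."""

    @pytest.fixture
    def coordinates(self):
        """Sample lat-long pair reused only by the tests in this class."""
        return (48.8584, 2.2945)

    def test_latitude_in_range(self, coordinates):
        assert -90 <= coordinates[0] <= 90

    def test_longitude_in_range(self, coordinates):
        assert -180 <= coordinates[1] <= 180
```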
- add the following file:
+ src/urban_meal_delivery/console.py => click-based CLI tools
+ tests/test_console.py => tests for the module above
- add a CLI entry point `umd`:
+ implement the --version / -V option
to show the installed package's version
+ rework the package's top level:
* add a __pkg_name__ variable to parameterize the package name
+ add unit and integration tests
- fix that pylint cannot know the proper order of imports in the
isolated nox session
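A minimal sketch of such a click-based entry point; only the `--version` / `-V` behavior is taken from the bullets above, and the `umd` entry-point wiring in the packaging metadata is omitted:

```python
import click

from urban_meal_delivery import __version__


def show_version(ctx, _param, value):
    """Callback for the --version / -V option."""
    if not value or ctx.resilient_parsing:
        return
    click.echo(f"urban-meal-delivery, version {__version__}")
    ctx.exit()


@click.command()
@click.option(
    "--version", "-V", is_flag=True, is_eager=True, expose_value=False,
    callback=show_version, help="Show the version and exit.",
)
def main() -> None:
    """The urban-meal-delivery command-line interface."""
```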
- use pytest as the base, measure coverage with pytest-cov
+ configure coverage to include branches and specify source locations
+ configure pytest to enforce explicit markers
- add a package for the test suite under tests/
- add a `__version__` identifier at the package's root
+ it is dynamically assigned the version of the installed package
+ the version is PEP440 compliant and follows a strict subset of
semantic versioning: x.y.z[.devN] where x, y, z, and N are all
non-negative integers
+ add module with tests for the __version__
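A small sketch of a test for that versioning scheme (the regex is written from the x.y.z[.devN] description above):

```python
import re

from urban_meal_delivery import __version__


def test_version_is_strict_semver_subset():
    """x.y.z[.devN] where x, y, z, and N are non-negative integers."""
    assert re.fullmatch(r"\d+\.\d+\.\d+(\.dev\d+)?", __version__) is not None
```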
- add a nox session "test" that runs the test suite
- use flake8 to lint pytest for consistent style
- use flake8 as the main and pylint as the auxiliary linter
- install flake8 with the following plug-ins:
+ flake8-annotations => enforce type annotations for functions/classes
+ flake8-black => ensure black would not make any changes
+ flake8-expression-complexity
+ wemake-python-styleguide, which packages the following:
* darglint * flake8-bandit * flake8-broken-line
* flake8-bugbear * flake8-commas * flake8-comprehensions
* flake8-debugger * flake8-docstrings * flake8-eradicate
* flake8-isort * flake8-rst-docstrings * flake8-string-format
* flake8-quotes * pep8-naming
- configure flake8 & friends in a rather explicit and strict way
- isort needed to be downgraded to ^4.3.21 due to a conflict with
pylint and wemake-python-styleguide:
+ provide TODO's to remove the parts that "fix" isort
- use mypy for static type checking
- add a nox session "lint" that runs flake8, mypy, and pylint
- lint all source files
- (auto-)format code with:
+ autoflake => * remove unused imports and variables
* remove duplicate dict keys
* expand star imports
+ black => enforce an uncompromising code style
+ isort => enforce a consistent import style
(complying with Google's Python Style Guide)
- implement the nox session "format" that runs all these tools
- add the following file:
+ setup.cfg => holds configurations for the development tools
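For orientation, a stripped-down noxfile combining the sessions mentioned throughout; session contents, tool arguments, and the Python version are simplified assumptions:

```python
"""Simplified excerpt of the project's nox sessions."""

import nox


@nox.session(python="3.8")
def format(session):
    """(Auto-)format the code with autoflake, black, and isort."""
    session.install("autoflake", "black", "isort")
    session.run("autoflake", "--in-place", "--recursive", "--remove-all-unused-imports", "src", "tests")
    session.run("black", "src", "tests", "noxfile.py")
    session.run("isort", "--recursive", "src", "tests", "noxfile.py")


@nox.session(python="3.8")
def lint(session):
    """Lint with flake8 (plus plug-ins), mypy, and pylint."""
    session.install("flake8", "wemake-python-styleguide", "mypy", "pylint")
    session.run("flake8", "src", "tests", "noxfile.py")
    session.run("mypy", "src")
    session.run("pylint", "src")


@nox.session(python="3.8")
def test(session):
    """Run the test suite and measure coverage."""
    session.install(".", "pytest", "pytest-cov")
    session.run("pytest", "--cov=urban_meal_delivery", "tests")
```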