Compare commits
No commits in common. "main" and "v0.3.0" have entirely different histories.
56 changed files with 1048 additions and 621475 deletions
.gitignore (vendored) — 1 change
@@ -1,7 +1,6 @@
 .cache/
 **/*.egg-info/
 .env
-.idea/
 **/.ipynb_checkpoints/
 .python-version
 .venv/
README.md — 34 changes
@@ -16,39 +16,19 @@ that iteratively build on each other.
 ### Data Cleaning
 
 The UDP provided its raw data as a PostgreSQL dump.
-This [notebook](https://nbviewer.jupyter.org/github/webartifex/urban-meal-delivery/blob/main/research/01_clean_data.ipynb)
+This [notebook](https://nbviewer.jupyter.org/github/webartifex/urban-meal-delivery/blob/develop/research/clean_data.ipynb)
 cleans the data extensively
-and maps them onto the [ORM models](https://github.com/webartifex/urban-meal-delivery/tree/main/src/urban_meal_delivery/db)
+and maps them onto the [ORM models](https://github.com/webartifex/urban-meal-delivery/tree/develop/src/urban_meal_delivery/db)
 defined in the `urban-meal-delivery` package
-that is developed in the [src/](https://github.com/webartifex/urban-meal-delivery/tree/main/src) folder
+that is developed in the [src/](https://github.com/webartifex/urban-meal-delivery/tree/develop/src) folder
 and contains all source code to drive the analyses.
 
 Due to a non-disclosure agreement with the UDP,
 neither the raw nor the cleaned data are published as of now.
-However, previews of the data can be seen throughout the [research/](https://github.com/webartifex/urban-meal-delivery/tree/main/research) folder.
+However, previews of the data can be seen throughout the [research/](https://github.com/webartifex/urban-meal-delivery/tree/develop/research) folder.
 
 
-### Tactical Demand Forecasting
+### Real-time Demand Forecasting
 
-Before any optimizations of the UDP's operations are done,
-a **demand forecasting** system for *tactical* purposes is implemented.
-To achieve that, the cities first undergo a **gridification** step
-where each *pickup* location is assigned into a pixel on a "checker board"-like grid.
-The main part of the source code that implements that is in this [file](https://github.com/webartifex/urban-meal-delivery/blob/main/src/urban_meal_delivery/db/grids.py#L60).
-Visualizations of the various grids can be found in the [visualizations/](https://github.com/webartifex/urban-meal-delivery/tree/main/research/visualizations) folder
-and in this [notebook](https://nbviewer.jupyter.org/github/webartifex/urban-meal-delivery/blob/main/research/03_grid_visualizations.ipynb).
-
-Then, demand is aggregated on a per-pixel level
-and different kinds of order time series are generated.
-The latter are the input to different kinds of forecasting `*Model`s.
-They all have in common that they predict demand into the *short-term* future (e.g., one hour)
-and are thus used for tactical purposes, in particular predictive routing (cf., next section).
-The details of how this works can be found in the first academic paper
-published in the context of this research project
-and titled "*Real-time Demand Forecasting for an Urban Delivery Platform*"
-(cf., the [repository](https://github.com/webartifex/urban-meal-delivery-demand-forecasting) with the LaTeX files).
-All demand forecasting related code is in the [forecasts/](https://github.com/webartifex/urban-meal-delivery/tree/main/src/urban_meal_delivery/forecasts) sub-package.
 
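The gridification described in the README hunk above assigns each pickup location to a pixel on a "checker board"-like grid. Once locations are projected into meters, that assignment reduces to floor division by the grid's side length; the following is a heavily simplified, hypothetical sketch, not the repository's actual implementation in `grids.py`:

```python
def assign_pixel(x: float, y: float, side_length: float) -> tuple[int, int]:
    """Map a projected (x, y) location in meters to its pixel on a square grid.

    The pixel is identified by its (column, row) index; larger side lengths
    produce coarser grids with fewer pixels, as in the 707 / 1000 / 1414 meter
    grids mentioned elsewhere in this compare view.
    """
    return (int(x // side_length), int(y // side_length))


# A location 1.5 km east and 0.7 km north of the grid origin falls into
# pixel (1, 0) on the 1000-meter grid.
pixel = assign_pixel(1500.0, 700.0, 1000.0)
```

The same location lands in different pixels on differently sized grids, which is why demand is aggregated per pixel separately for each side length.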
 ### Predictive Routing
 
@@ -71,11 +51,11 @@ and
 `poetry install --extras research`
 
 The `--extras` option is necessary as the non-develop dependencies
-are structured in the [pyproject.toml](https://github.com/webartifex/urban-meal-delivery/blob/main/pyproject.toml) file
+are structured in the [pyproject.toml](https://github.com/webartifex/urban-meal-delivery/blob/develop/pyproject.toml) file
 into dependencies related to only the `urban-meal-delivery` source code package
 and dependencies used to run the [Jupyter](https://jupyter.org/) environment
 with the analyses.
 
 Contributions are welcome.
 Use the [issues](https://github.com/webartifex/urban-meal-delivery/issues) tab.
-The project is licensed under the [MIT license](https://github.com/webartifex/urban-meal-delivery/blob/main/LICENSE.txt).
+The project is licensed under the [MIT license](https://github.com/webartifex/urban-meal-delivery/blob/develop/LICENSE.txt).
(deleted file — an Alembic migration)
@@ -1,96 +0,0 @@
-"""Add distance matrix.
-
-Revision: #b4dd0b8903a5 at 2021-03-01 16:14:06
-Revises: #8bfb928a31f8
-"""
-
-import os
-
-import sqlalchemy as sa
-from alembic import op
-from sqlalchemy.dialects import postgresql
-
-from urban_meal_delivery import configuration
-
-
-revision = 'b4dd0b8903a5'
-down_revision = '8bfb928a31f8'
-branch_labels = None
-depends_on = None
-
-
-config = configuration.make_config('testing' if os.getenv('TESTING') else 'production')
-
-
-def upgrade():
-    """Upgrade to revision b4dd0b8903a5."""
-    op.create_table(
-        'addresses_addresses',
-        sa.Column('first_address_id', sa.Integer(), nullable=False),
-        sa.Column('second_address_id', sa.Integer(), nullable=False),
-        sa.Column('city_id', sa.SmallInteger(), nullable=False),
-        sa.Column('air_distance', sa.Integer(), nullable=False),
-        sa.Column('bicycle_distance', sa.Integer(), nullable=True),
-        sa.Column('bicycle_duration', sa.Integer(), nullable=True),
-        sa.Column('directions', postgresql.JSON(), nullable=True),
-        sa.PrimaryKeyConstraint(
-            'first_address_id',
-            'second_address_id',
-            name=op.f('pk_addresses_addresses'),
-        ),
-        sa.ForeignKeyConstraint(
-            ['first_address_id', 'city_id'],
-            [
-                f'{config.CLEAN_SCHEMA}.addresses.id',
-                f'{config.CLEAN_SCHEMA}.addresses.city_id',
-            ],
-            name=op.f(
-                'fk_addresses_addresses_to_addresses_via_first_address_id_city_id',
-            ),
-            onupdate='RESTRICT',
-            ondelete='RESTRICT',
-        ),
-        sa.ForeignKeyConstraint(
-            ['second_address_id', 'city_id'],
-            [
-                f'{config.CLEAN_SCHEMA}.addresses.id',
-                f'{config.CLEAN_SCHEMA}.addresses.city_id',
-            ],
-            name=op.f(
-                'fk_addresses_addresses_to_addresses_via_second_address_id_city_id',
-            ),
-            onupdate='RESTRICT',
-            ondelete='RESTRICT',
-        ),
-        sa.UniqueConstraint(
-            'first_address_id',
-            'second_address_id',
-            name=op.f('uq_addresses_addresses_on_first_address_id_second_address_id'),
-        ),
-        sa.CheckConstraint(
-            'first_address_id < second_address_id',
-            name=op.f('ck_addresses_addresses_on_distances_are_symmetric_for_bicycles'),
-        ),
-        sa.CheckConstraint(
-            '0 <= air_distance AND air_distance < 20000',
-            name=op.f('ck_addresses_addresses_on_realistic_air_distance'),
-        ),
-        sa.CheckConstraint(
-            'bicycle_distance < 25000',
-            name=op.f('ck_addresses_addresses_on_realistic_bicycle_distance'),
-        ),
-        sa.CheckConstraint(
-            'air_distance <= bicycle_distance',
-            name=op.f('ck_addresses_addresses_on_air_distance_is_shortest'),
-        ),
-        sa.CheckConstraint(
-            '0 <= bicycle_duration AND bicycle_duration <= 3600',
-            name=op.f('ck_addresses_addresses_on_realistic_bicycle_travel_time'),
-        ),
-        schema=config.CLEAN_SCHEMA,
-    )
-
-
-def downgrade():
-    """Downgrade to revision 8bfb928a31f8."""
-    op.drop_table('addresses_addresses', schema=config.CLEAN_SCHEMA)
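The `first_address_id < second_address_id` check constraint in the migration above means each unordered address pair is stored exactly once (distances are treated as symmetric for bicycles), so callers have to normalize a pair before inserting or looking it up. A minimal sketch of that convention — the helper is hypothetical, not taken from the repository:

```python
def normalized_pair(address_id_1: int, address_id_2: int) -> tuple[int, int]:
    """Order an address pair so that each distance is stored exactly once.

    Mirrors the table's `first_address_id < second_address_id` check
    constraint: the smaller id always goes into the first column.
    """
    if address_id_1 == address_id_2:
        raise ValueError('no distance is stored between an address and itself')
    return min(address_id_1, address_id_2), max(address_id_1, address_id_2)
```

With this, `normalized_pair(42, 7)` and `normalized_pair(7, 42)` both resolve to the same row key `(7, 42)`.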
@@ -135,6 +135,7 @@ def lint(session):
         'flake8',
         'flake8-annotations',
         'flake8-black',
+        'flake8-expression-complexity',
         'flake8-pytest-style',
         'mypy',
         'wemake-python-styleguide',
@@ -196,6 +197,7 @@ def test(session):
         'pytest-cov',
         'pytest-env',
         'pytest-mock',
+        'pytest-randomly',
         'xdoctest[optional]',
     )
 
@@ -205,7 +207,7 @@ def test(session):
     # test cases that require the slow installation of R and some packages.
     if session.env.get('_slow_ci_tests'):
         session.run(
-            'pytest', '-m', 'r and not db', PYTEST_LOCATION,
+            'pytest', '--randomly-seed=4287', '-m', 'r and not db', PYTEST_LOCATION,
         )
 
     # In the "ci-tests-slow" session, we do not run any test tool
@@ -217,6 +219,7 @@ def test(session):
     # Therefore, the CI server does not measure coverage.
     elif session.env.get('_fast_ci_tests'):
         pytest_args = (
+            '--randomly-seed=4287',
             '-m',
             'not (db or r)',
             PYTEST_LOCATION,
@@ -232,6 +235,7 @@ def test(session):
             '--cov-branch',
             '--cov-fail-under=100',
             '--cov-report=term-missing:skip-covered',
+            '--randomly-seed=4287',
             PYTEST_LOCATION,
         )
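The `--randomly-seed=4287` flag added in the hunks above pins pytest-randomly's otherwise random test ordering, so every CI run executes the tests in the same shuffled order: the seed fully determines the permutation. The underlying idea reduces to seeded shuffling; a minimal sketch:

```python
import random


def shuffled_order(tests: list[str], seed: int) -> list[str]:
    """Return the deterministic test order that a given seed produces.

    A fixed seed yields the same permutation on every run, which is what
    makes a pinned `--randomly-seed` reproducible across CI machines.
    """
    order = tests.copy()
    random.Random(seed).shuffle(order)  # seeded RNG -> deterministic shuffle
    return order
```

Two runs with the same seed produce identical orders, while the shuffle still exercises test isolation by not running the suite in definition order.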
poetry.lock (generated) — 2023 changes; diff suppressed because it is too large.
@@ -9,7 +9,7 @@ target-version = ["py38"]
 
 [tool.poetry]
 name = "urban-meal-delivery"
-version = "0.4.0"
+version = "0.3.0"
 
 authors = ["Alexander Hess <alexander@webartifex.biz>"]
 description = "Optimizing an urban meal delivery platform"
@@ -32,10 +32,7 @@ Shapely = "^1.7.1"
 alembic = "^1.4.2"
 click = "^7.1.2"
 folium = "^0.12.1"
-geopy = "^2.1.0"
-googlemaps = "^4.4.2"
 matplotlib = "^3.3.3"
-ordered-set = "^4.0.2"
 pandas = "^1.1.0"
 psycopg2 = "^2.8.5"  # adapter for PostgreSQL
 rpy2 = "^3.4.1"
@@ -72,6 +69,7 @@ isort = "^4.3.21"  # TODO (isort): not ^5.5.4 due to wemake-python-styleguide
 flake8 = "^3.8.3"
 flake8-annotations = "^2.3.0"
 flake8-black = "^0.2.1"
+flake8-expression-complexity = "^0.0.8"
 flake8-pytest-style = "^1.2.2"
 mypy = "^0.782"
 wemake-python-styleguide = "^0.14.1"  # flake8 plug-in
@@ -85,6 +83,7 @@ pytest = "^6.0.1"
 pytest-cov = "^2.10.0"
 pytest-env = "^0.6.2"
 pytest-mock = "^3.5.1"
+pytest-randomly = "^3.5.0"
 xdoctest = { version="^0.13.0", extras=["optional"] }
 
 # Documentation
(deleted file — a Jupyter notebook)
@@ -1,168 +0,0 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "# Gridification"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "This notebook runs the gridification script and creates all the pixels in the database."
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 1,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "\u001b[32murban-meal-delivery\u001b[0m, version \u001b[34m0.3.0\u001b[0m\n"
-     ]
-    }
-   ],
-   "source": [
-    "!umd --version"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Upgrade Database Schema"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "This database migration also de-duplicates redundant addresses and removes obvious outliers."
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 2,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "%cd -q .."
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 3,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.\n",
-      "INFO  [alembic.runtime.migration] Will assume transactional DDL.\n",
-      "INFO  [alembic.runtime.migration] Running upgrade f11cd76d2f45 -> 888e352d7526, Add pixel grid.\n",
-      "INFO  [alembic.runtime.migration] Running upgrade 888e352d7526 -> e40623e10405, Add demand forecasting.\n",
-      "INFO  [alembic.runtime.migration] Running upgrade e40623e10405 -> 26711cd3f9b9, Add confidence intervals to forecasts.\n",
-      "INFO  [alembic.runtime.migration] Running upgrade 26711cd3f9b9 -> e86290e7305e, Remove orders from restaurants with invalid location ...\n"
-     ]
-    }
-   ],
-   "source": [
-    "!alembic upgrade e86290e7305e"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Create the Grids"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "Put all restaurant locations in pixels."
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 4,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "3 cities retrieved from the database\n",
-      "\n",
-      "Creating grids for Lyon\n",
-      "Creating grid with a side length of 707 meters\n",
-      " -> created 62 pixels\n",
-      "Creating grid with a side length of 1000 meters\n",
-      " -> created 38 pixels\n",
-      "Creating grid with a side length of 1414 meters\n",
-      " -> created 24 pixels\n",
-      "=> assigned 358 out of 48058 addresses in Lyon\n",
-      "\n",
-      "Creating grids for Paris\n",
-      "Creating grid with a side length of 707 meters\n",
-      " -> created 199 pixels\n",
-      "Creating grid with a side length of 1000 meters\n",
-      " -> created 111 pixels\n",
-      "Creating grid with a side length of 1414 meters\n",
-      " -> created 66 pixels\n",
-      "=> assigned 1133 out of 108135 addresses in Paris\n",
-      "\n",
-      "Creating grids for Bordeaux\n",
-      "Creating grid with a side length of 707 meters\n",
-      " -> created 30 pixels\n",
-      "Creating grid with a side length of 1000 meters\n",
-      " -> created 22 pixels\n",
-      "Creating grid with a side length of 1414 meters\n",
-      " -> created 15 pixels\n",
-      "=> assigned 123 out of 21742 addresses in Bordeaux\n"
-     ]
-    }
-   ],
-   "source": [
-    "!umd gridify"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 5,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "%cd -q research"
-   ]
-  }
- ],
- "metadata": {
-  "kernelspec": {
-   "display_name": "Python 3",
-   "language": "python",
-   "name": "python3"
-  },
-  "language_info": {
-   "codemirror_mode": {
-    "name": "ipython",
-    "version": 3
-   },
-   "file_extension": ".py",
-   "mimetype": "text/x-python",
-   "name": "python",
-   "nbconvert_exporter": "python",
-   "pygments_lexer": "ipython3",
-   "version": "3.8.6"
-  }
- },
- "nbformat": 4,
- "nbformat_minor": 4
-}
5 file diffs suppressed because one or more lines are too long; 1 file diff suppressed because it is too large.
@@ -19,7 +19,7 @@
     "- numeric columns are checked for plausibility\n",
     "- foreign key relationships are strictly enforced\n",
     "\n",
-    "The structure of the data can be viewed at the [ORM layer](https://github.com/webartifex/urban-meal-delivery/tree/main/src/urban_meal_delivery/db) in the package."
+    "The structure of the data can be viewed at the [ORM layer](https://github.com/webartifex/urban-meal-delivery/tree/develop/src/urban_meal_delivery/db) in the package."
    ]
   },
   {
@@ -192,7 +192,7 @@
    "source": [
     "%cd -q ..\n",
     "!alembic upgrade f11cd76d2f45\n",
-    "%cd -q research"
+    "%cd -q notebooks"
    ]
   },
   {
@@ -7647,7 +7647,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.8.6"
+   "version": "3.8.5"
   }
  },
  "nbformat": 4,
@@ -114,7 +114,7 @@
     "    shutil.rmtree(r_libs_path)\n",
     "except FileNotFoundError:\n",
     "    pass\n",
-    "os.makedirs(r_libs_path)"
+    "os.mkdir(r_libs_path)"
    ]
   },
   {
9 file diffs suppressed because they are too large.
setup.cfg — 31 changes
@@ -72,6 +72,8 @@ select =
     ANN0, ANN2, ANN3,
     # flake8-black => complain if black would make changes
     BLK1, BLK9,
+    # flake8-expression-complexity => not too many expressions at once
+    ECE001,
     # flake8-pytest-style => enforce a consistent style with pytest
     PT0,
@@ -87,8 +89,6 @@ extend-ignore =
     # Comply with black's style.
     # Source: https://github.com/psf/black/blob/master/docs/compatible_configs.md#flake8
     E203, W503, WPS348,
-    # Let's not do `@pytest.mark.no_cover()` instead of `@pytest.mark.no_cover`.
-    PT023,
     # Google's Python Style Guide is not reStructuredText
     # until after being processed by Sphinx Napoleon.
     # Source: https://github.com/peterjc/flake8-rst-docstrings/issues/17
@@ -144,9 +144,6 @@ per-file-ignores =
     src/urban_meal_delivery/console/forecasts.py:
         # The module is not too complex.
         WPS232,
-    src/urban_meal_delivery/db/addresses_addresses.py:
-        # The module does not have too many imports.
-        WPS201,
     src/urban_meal_delivery/db/customers.py:
         # The module is not too complex.
         WPS232,
@@ -203,7 +200,7 @@ max-complexity = 10
 max-local-variables = 8
 
 # Allow more than wemake-python-styleguide's 7 methods per class.
-max-methods = 15
+max-methods = 12
 
 # Comply with black's style.
 # Source: https://github.com/psf/black/blob/master/docs/the_black_code_style.md#line-length
@@ -220,7 +217,6 @@ allowed-domain-names =
     obj,
     param,
     result,
-    results,
     value,
 max-name-length = 40
 # darglint
@@ -269,35 +265,14 @@ single_line_exclusions = typing
 [mypy]
 cache_dir = .cache/mypy
 
-# Check the interior of functions without type annotations.
-check_untyped_defs = true
-
-# Disallow generic types without explicit type parameters.
-disallow_any_generics = true
-
-# Disallow functions with incomplete type annotations.
-disallow_incomplete_defs = true
-
-# Disallow calling functions without type annotations.
-disallow_untyped_calls = true
-
-# Disallow functions without type annotations (or incomplete annotations).
-disallow_untyped_defs = true
-
 [mypy-folium.*]
 ignore_missing_imports = true
-[mypy-geopy.*]
-ignore_missing_imports = true
-[mypy-googlemaps.*]
-ignore_missing_imports = true
 [mypy-matplotlib.*]
 ignore_missing_imports = true
 [mypy-nox.*]
 ignore_missing_imports = true
 [mypy-numpy.*]
 ignore_missing_imports = true
-[mypy-ordered_set.*]
-ignore_missing_imports = true
 [mypy-packaging]
 ignore_missing_imports = true
 [mypy-pandas]
@@ -49,9 +49,9 @@ class Config:
     TIME_STEPS = [60]
 
     # Training horizons (in full weeks) used to train the forecasting models.
-    # For now, we only use 7 and 8 weeks as that was the best performing in
+    # For now, we only use 8 weeks as that was the best performing in
     # a previous study (note:4f79e8fa).
-    TRAIN_HORIZONS = [7, 8]
+    TRAIN_HORIZONS = [8]
 
     # The demand forecasting methods used in the simulations.
     FORECASTING_METHODS = ['hets', 'rtarima']
@@ -59,13 +59,11 @@ class Config:
     # Colors for the visualizations ins `folium`.
     RESTAURANT_COLOR = 'red'
     CUSTOMER_COLOR = 'blue'
-    NEUTRAL_COLOR = 'black'
 
     # Implementation-specific settings
     # --------------------------------
 
     DATABASE_URI = os.getenv('DATABASE_URI')
-    GOOGLE_MAPS_API_KEY = os.getenv('GOOGLE_MAPS_API_KEY')
 
     # The PostgreSQL schema that holds the tables with the original data.
     ORIGINAL_SCHEMA = os.getenv('ORIGINAL_SCHEMA') or 'public'
@@ -124,7 +122,7 @@ def make_config(env: str = 'production') -> Config:
     # the warning is only emitted if the code is not run by pytest.
     # We see the bad configuration immediately as all "db" tests fail.
     if config.DATABASE_URI is None and not os.getenv('TESTING'):
-        warnings.warn('Bad configuration: no DATABASE_URI set in the environment')
+        warnings.warn('Bad configurartion: no DATABASE_URI set in the environment')
 
     # Some functionalities require R and some packages installed.
     # To ensure isolation and reproducibility, the projects keeps the R dependencies
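The `make_config` hunk above shows the package selecting a testing or production configuration from the environment and merely warning, rather than raising, when `DATABASE_URI` is missing. A minimal sketch of that factory pattern — the class names and bodies here are simplified stand-ins, not the package's real configuration classes:

```python
import os
import warnings


class Config:
    """Production defaults."""

    TESTING = False


class TestingConfig(Config):
    """Overrides used by the test suite."""

    TESTING = True


def make_config(env: str = 'production') -> Config:
    """Create a configuration object for the given environment."""
    configs = {'production': Config, 'testing': TestingConfig}
    config = configs[env]()
    # Warn instead of raising so that tooling without a database still runs;
    # a bad configuration surfaces immediately once any "db" code is hit.
    if os.getenv('DATABASE_URI') is None and not os.getenv('TESTING'):
        warnings.warn('Bad configuration: no DATABASE_URI set in the environment')
    return config
```

Callers pick the environment once, as in `make_config('testing' if os.getenv('TESTING') else 'production')` in the deleted migration earlier in this compare view.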
@@ -9,12 +9,10 @@ from typing import Any, Callable
 import click
 
 
-def db_revision(
-    rev: str,
-) -> Callable[..., Callable[..., Any]]:  # pragma: no cover -> easy to check visually
+def db_revision(rev: str) -> Callable:  # pragma: no cover -> easy to check visually
     """A decorator ensuring the database is at a given revision."""
 
-    def decorator(func: Callable[..., Any]) -> Callable[..., Any]:
+    def decorator(func: Callable) -> Callable:
         @functools.wraps(func)
         def ensure(*args: Any, **kwargs: Any) -> Any:  # noqa:WPS430
             """Do not execute the `func` if the revision does not match."""
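`db_revision` in the hunk above is a parametrized decorator: it takes the expected revision and only lets the wrapped command run when the database matches it. A minimal sketch of the same three-layer structure, with a hypothetical `current_revision` callable standing in for the Alembic lookup:

```python
import functools
from typing import Any, Callable


def require_revision(rev: str, current_revision: Callable[[], str]) -> Callable:
    """Only execute the decorated function if the database is at revision `rev`."""

    def decorator(func: Callable) -> Callable:
        @functools.wraps(func)  # preserve the wrapped function's name and docstring
        def ensure(*args: Any, **kwargs: Any) -> Any:
            if current_revision() != rev:
                raise RuntimeError(f'database is not at revision {rev}')
            return func(*args, **kwargs)

        return ensure

    return decorator


@require_revision('e86290e7305e', current_revision=lambda: 'e86290e7305e')
def gridify() -> str:
    """A toy command that only runs against the matching schema revision."""
    return 'gridified'
```

When the revisions match, the command runs normally; otherwise the guard fails fast before any database work starts.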
@@ -103,14 +103,9 @@ def tactical_heuristic(  # noqa:C901,WPS213,WPS216,WPS231
     # Important: this check may need to be adapted once further
     # commands are added the make `Forecast`s without the heuristic!
     # Continue with forecasting on the day the last prediction was made ...
-    last_predict_at = (
-        db.session.query(func.max(db.Forecast.start_at))  # noqa:WPS221
-        .join(db.Pixel, db.Forecast.pixel_id == db.Pixel.id)
-        .join(db.Grid, db.Pixel.grid_id == db.Grid.id)
+    last_predict_at = (  # noqa:ECE001
+        db.session.query(func.max(db.Forecast.start_at))
         .filter(db.Forecast.pixel == pixel)
-        .filter(db.Grid.side_length == side_length)
-        .filter(db.Forecast.time_step == time_step)
-        .filter(db.Forecast.train_horizon == train_horizon)
         .first()
     )[0]
     # ... or start `train_horizon` weeks after the first `Order`
|
@ -34,9 +34,8 @@ def gridify() -> None: # pragma: no cover note:b1f68d24
|
||||||
|
|
||||||
click.echo(f' -> created {len(grid.pixels)} pixels')
|
click.echo(f' -> created {len(grid.pixels)} pixels')
|
||||||
|
|
||||||
# Because the number of assigned addresses is the same across
|
# The number of assigned addresses is the same across different `side_length`s.
|
||||||
# different `side_length`s, we can take any `grid` from the `city`.
|
db.session.flush() # necessary for the query to work
|
||||||
grid = db.session.query(db.Grid).filter_by(city=city).first()
|
|
||||||
n_assigned = (
|
n_assigned = (
|
||||||
db.session.query(db.AddressPixelAssociation)
|
db.session.query(db.AddressPixelAssociation)
|
||||||
.filter(db.AddressPixelAssociation.grid_id == grid.id)
|
.filter(db.AddressPixelAssociation.grid_id == grid.id)
|
||||||
|
|
|
||||||
|
|
@@ -1,7 +1,6 @@
 """Provide the ORM models and a connection to the database."""
 
 from urban_meal_delivery.db.addresses import Address
-from urban_meal_delivery.db.addresses_addresses import Path
 from urban_meal_delivery.db.addresses_pixels import AddressPixelAssociation
 from urban_meal_delivery.db.cities import City
 from urban_meal_delivery.db.connection import connection
@@ -2,7 +2,6 @@

 from __future__ import annotations

-import functools
 from typing import Any

 import folium
@@ -11,7 +10,6 @@ from sqlalchemy import orm
 from sqlalchemy.dialects import postgresql
 from sqlalchemy.ext import hybrid

-from urban_meal_delivery import config
 from urban_meal_delivery.db import meta
 from urban_meal_delivery.db import utils

@@ -72,6 +70,9 @@ class Address(meta.Base):
     )
     pixels = orm.relationship('AddressPixelAssociation', back_populates='address')

+    # We do not implement a `.__init__()` method and leave that to SQLAlchemy.
+    # Instead, we use `hasattr()` to check for uninitialized attributes. grep:b1f68d24
+
     def __repr__(self) -> str:
         """Non-literal text representation."""
         return '<{cls}({street} in {city})>'.format(
@@ -89,7 +90,7 @@ class Address(meta.Base):
         """
         return self.id == self.primary_id

-    @functools.cached_property
+    @property
     def location(self) -> utils.Location:
         """The location of the address.

|
||||||
Implementation detail: This property is cached as none of the
|
Implementation detail: This property is cached as none of the
|
||||||
underlying attributes to calculate the value are to be changed.
|
underlying attributes to calculate the value are to be changed.
|
||||||
"""
|
"""
|
||||||
location = utils.Location(self.latitude, self.longitude)
|
if not hasattr(self, '_location'): # noqa:WPS421 note:b1f68d24
|
||||||
location.relate_to(self.city.southwest)
|
self._location = utils.Location(self.latitude, self.longitude)
|
||||||
return location
|
self._location.relate_to(self.city.southwest)
|
||||||
|
return self._location
|
||||||
|
|
||||||
@property
|
@property
|
||||||
def x(self) -> int: # noqa=WPS111
|
def x(self) -> int: # noqa=WPS111
|
||||||
|
|
@@ -152,7 +154,7 @@ class Address(meta.Base):
             `.city.map` for convenience in interactive usage
         """
         defaults = {
-            'color': f'{config.NEUTRAL_COLOR}',
+            'color': 'black',
             'popup': f'{self.street}, {self.zip_code} {self.city_name}',
         }
         defaults.update(kwargs)
@@ -1,316 +0,0 @@
-"""Model for the `Path` relationship between two `Address` objects."""
-
-from __future__ import annotations
-
-import functools
-import itertools
-import json
-from typing import List
-
-import folium
-import googlemaps as gm
-import ordered_set
-import sqlalchemy as sa
-from geopy import distance as geo_distance
-from sqlalchemy import orm
-from sqlalchemy.dialects import postgresql
-
-from urban_meal_delivery import config
-from urban_meal_delivery import db
-from urban_meal_delivery.db import meta
-from urban_meal_delivery.db import utils
-
-
-class Path(meta.Base):
-    """Path between two `Address` objects.
-
-    Models the path between two `Address` objects, including directions
-    for a `Courier` to get from one `Address` to another.
-
-    As the couriers are on bicycles, we model the paths as
-    a symmetric graph (i.e., same distance in both directions).
-
-    Implements an association pattern between `Address` and `Address`.
-
-    Further info:
-        https://docs.sqlalchemy.org/en/stable/orm/basic_relationships.html#association-object  # noqa:E501
-    """
-
-    __tablename__ = 'addresses_addresses'
-
-    # Columns
-    first_address_id = sa.Column(sa.Integer, primary_key=True)
-    second_address_id = sa.Column(sa.Integer, primary_key=True)
-    city_id = sa.Column(sa.SmallInteger, nullable=False)
-    # Distances are measured in meters.
-    air_distance = sa.Column(sa.Integer, nullable=False)
-    bicycle_distance = sa.Column(sa.Integer, nullable=True)
-    # The duration is measured in seconds.
-    bicycle_duration = sa.Column(sa.Integer, nullable=True)
-    # An array of latitude-longitude pairs approximating a courier's way.
-    _directions = sa.Column('directions', postgresql.JSON, nullable=True)
-
-    # Constraints
-    __table_args__ = (
-        # The two `Address` objects must be in the same `.city`.
-        sa.ForeignKeyConstraint(
-            ['first_address_id', 'city_id'],
-            ['addresses.id', 'addresses.city_id'],
-            onupdate='RESTRICT',
-            ondelete='RESTRICT',
-        ),
-        sa.ForeignKeyConstraint(
-            ['second_address_id', 'city_id'],
-            ['addresses.id', 'addresses.city_id'],
-            onupdate='RESTRICT',
-            ondelete='RESTRICT',
-        ),
-        # Each `Address`-`Address` pair only has one distance.
-        sa.UniqueConstraint('first_address_id', 'second_address_id'),
-        sa.CheckConstraint(
-            'first_address_id < second_address_id',
-            name='distances_are_symmetric_for_bicycles',
-        ),
-        sa.CheckConstraint(
-            '0 <= air_distance AND air_distance < 20000', name='realistic_air_distance',
-        ),
-        sa.CheckConstraint(
-            'bicycle_distance < 25000',  # `.bicycle_distance` may not be negative
-            name='realistic_bicycle_distance',  # due to the constraint below.
-        ),
-        sa.CheckConstraint(
-            'air_distance <= bicycle_distance', name='air_distance_is_shortest',
-        ),
-        sa.CheckConstraint(
-            '0 <= bicycle_duration AND bicycle_duration <= 3600',
-            name='realistic_bicycle_travel_time',
-        ),
-    )
-
-    # Relationships
-    first_address = orm.relationship(
-        'Address', foreign_keys='[Path.first_address_id, Path.city_id]',
-    )
-    second_address = orm.relationship(
-        'Address',
-        foreign_keys='[Path.second_address_id, Path.city_id]',
-        overlaps='first_address',
-    )
-
-    @classmethod
-    def from_addresses(
-        cls, *addresses: db.Address, google_maps: bool = False,
-    ) -> List[Path]:
-        """Calculate pair-wise paths for `Address` objects.
-
-        This is the main constructor method for the class.
-
-        It handles the "sorting" of the `Address` objects by `.id`, which is
-        the logic that enforces the symmetric graph behind the paths.
-
-        Args:
-            *addresses: to calculate the pair-wise paths for;
-                must contain at least two `Address` objects
-            google_maps: if `.bicycle_distance` and `._directions` should be
-                populated with a query to the Google Maps Directions API;
-                by default, only the `.air_distance` is calculated with `geopy`
-
-        Returns:
-            paths
-        """
-        paths = []
-
-        # We consider all 2-tuples of `Address`es. The symmetric graph is ...
-        for first, second in itertools.combinations(addresses, 2):
-            # ... implicitly enforced by a precedence constraint for the `.id`s.
-            first, second = (  # noqa:WPS211
-                (first, second) if first.id < second.id else (second, first)
-            )
-
-            # If there is no `Path` object in the database ...
-            path = (
-                db.session.query(db.Path)
-                .filter(db.Path.first_address == first)
-                .filter(db.Path.second_address == second)
-                .first()
-            )
-            # ... create a new one.
-            if path is None:
-                air_distance = geo_distance.great_circle(
-                    first.location.lat_lng, second.location.lat_lng,
-                )
-
-                path = cls(
-                    first_address=first,
-                    second_address=second,
-                    air_distance=round(air_distance.meters),
-                )
-
-                db.session.add(path)
-                db.session.commit()
-
-            paths.append(path)
-
-        if google_maps:
-            for path in paths:  # noqa:WPS440
-                path.sync_with_google_maps()
-
-        return paths
-
-    @classmethod
-    def from_order(cls, order: db.Order, google_maps: bool = False) -> Path:
-        """Calculate the path for an `Order` object.
-
-        The path goes from the `Order.pickup_address` to the `Order.delivery_address`.
-
-        Args:
-            order: to calculate the path for
-            google_maps: if `.bicycle_distance` and `._directions` should be
-                populated with a query to the Google Maps Directions API;
-                by default, only the `.air_distance` is calculated with `geopy`
-
-        Returns:
-            path
-        """
-        return cls.from_addresses(
-            order.pickup_address, order.delivery_address, google_maps=google_maps,
-        )[0]
-
-    def sync_with_google_maps(self) -> None:
-        """Fill in `.bicycle_distance` and `._directions` with Google Maps.
-
-        `._directions` will NOT contain the coordinates
-        of `.first_address` and `.second_address`.
-
-        This uses the Google Maps Directions API.
-
-        Further info:
-            https://developers.google.com/maps/documentation/directions
-        """
-        # To save costs, we do not make an API call
-        # if we already have data from Google Maps.
-        if self.bicycle_distance is not None:
-            return
-
-        client = gm.Client(config.GOOGLE_MAPS_API_KEY)
-        response = client.directions(
-            origin=self.first_address.location.lat_lng,
-            destination=self.second_address.location.lat_lng,
-            mode='bicycling',
-            alternatives=False,
-        )
-        # Without "alternatives" and "waypoints", the `response` contains
-        # exactly one "route" that consists of exactly one "leg".
-        # Source: https://developers.google.com/maps/documentation/directions/get-directions#Legs  # noqa:E501
-        route = response[0]['legs'][0]
-
-        self.bicycle_distance = route['distance']['value']  # noqa:WPS601
-        self.bicycle_duration = route['duration']['value']  # noqa:WPS601
-
-        # Each route consists of many "steps" that are instructions as to how to
-        # get from A to B. As a step's "start_location" may equal the previous step's
-        # "end_location", we use an `OrderedSet` to find the unique latitude-longitude
-        # pairs that make up the path from `.first_address` to `.second_address`.
-        steps = ordered_set.OrderedSet()
-        for step in route['steps']:
-            steps.add(  # noqa:WPS221
-                (step['start_location']['lat'], step['start_location']['lng']),
-            )
-            steps.add(  # noqa:WPS221
-                (step['end_location']['lat'], step['end_location']['lng']),
-            )
-
-        steps.discard(self.first_address.location.lat_lng)
-        steps.discard(self.second_address.location.lat_lng)
-
-        self._directions = json.dumps(list(steps))  # noqa:WPS601
-
-        db.session.add(self)
-        db.session.commit()
-
-    @property  # pragma: no cover
-    def map(self) -> folium.Map:  # noqa:WPS125
-        """Convenience property to obtain the underlying `City.map`."""
-        return self.first_address.city.map
-
-    @functools.cached_property
-    def waypoints(self) -> List[utils.Location]:
-        """The couriers' route from `.first_address` to `.second_address`.
-
-        The returned `Location`s all relate to `.first_address.city.southwest`.
-
-        Implementation detail: This property is cached as none of the
-        underlying attributes (i.e., `._directions`) are to be changed.
-        """
-        points = [utils.Location(*point) for point in json.loads(self._directions)]
-        for point in points:
-            point.relate_to(self.first_address.city.southwest)
-
-        return points
-
-    def draw(  # noqa:WPS211
-        self,
-        *,
-        reverse: bool = False,
-        start_tooltip: str = 'Start',
-        end_tooltip: str = 'End',
-        start_color: str = 'green',
-        end_color: str = 'red',
-        path_color: str = 'black',
-    ) -> folium.Map:  # pragma: no cover
-        """Draw the `.waypoints` from `.first_address` to `.second_address`.
-
-        Args:
-            reverse: by default, `.first_address` is used as the start;
-                set to `False` to make `.second_address` the start
-            start_tooltip: text shown on marker at the path's start
-            end_tooltip: text shown on marker at the path's end
-            start_color: `folium` color for the path's start
-            end_color: `folium` color for the path's end
-            path_color: `folium` color along the path, which
-                is the line between the `.waypoints`
-
-        Returns:
-            `.map` for convenience in interactive usage
-        """
-        # Without `self._directions` synced from Google Maps,
-        # the `.waypoints` are not available.
-        self.sync_with_google_maps()
-
-        # First, plot the couriers' path between the start and
-        # end locations, so that it is below the `folium.Circle`s.
-        line = folium.PolyLine(
-            locations=(
-                self.first_address.location.lat_lng,
-                *(point.lat_lng for point in self.waypoints),
-                self.second_address.location.lat_lng,
-            ),
-            color=path_color,
-            weight=2,
-        )
-        line.add_to(self.map)
-
-        # Draw the path's start and end locations, possibly reversed,
-        # on top of the couriers' path.
-
-        if reverse:
-            start, end = self.second_address, self.first_address
-        else:
-            start, end = self.first_address, self.second_address
-
-        start.draw(
-            radius=5,
-            color=start_color,
-            fill_color=start_color,
-            fill_opacity=1,
-            tooltip=start_tooltip,
-        )
-        end.draw(
-            radius=5,
-            color=end_color,
-            fill_color=end_color,
-            fill_opacity=1,
-            tooltip=end_tooltip,
-        )
-
-        return self.map
@@ -10,7 +10,7 @@ class AddressPixelAssociation(meta.Base):
     """Association pattern between `Address` and `Pixel`.

     This approach is needed here mainly because it implicitly
-    updates the `city_id` and `grid_id` columns.
+    updates the `_city_id` and `_grid_id` columns.

     Further info:
         https://docs.sqlalchemy.org/en/stable/orm/basic_relationships.html#association-object  # noqa:E501
@@ -2,8 +2,6 @@

 from __future__ import annotations

-import functools
-
 import folium
 import sqlalchemy as sa
 from sqlalchemy import orm
@@ -40,39 +38,51 @@ class City(meta.Base):
     addresses = orm.relationship('Address', back_populates='city')
     grids = orm.relationship('Grid', back_populates='city')

-    # We do not implement a `.__init__()` method and use SQLAlchemy's default.
-    # The uninitialized attribute `._map` is computed on the fly. note:d334120e
+    # We do not implement a `.__init__()` method and leave that to SQLAlchemy.
+    # Instead, we use `hasattr()` to check for uninitialized attributes. grep:d334120e

     def __repr__(self) -> str:
         """Non-literal text representation."""
         return '<{cls}({name})>'.format(cls=self.__class__.__name__, name=self.name)

-    @functools.cached_property
+    @property
     def center(self) -> utils.Location:
         """Location of the city's center.

         Implementation detail: This property is cached as none of the
         underlying attributes to calculate the value are to be changed.
         """
-        return utils.Location(self.center_latitude, self.center_longitude)
+        if not hasattr(self, '_center'):  # noqa:WPS421 note:d334120e
+            self._center = utils.Location(self.center_latitude, self.center_longitude)
+        return self._center

-    @functools.cached_property
+    @property
     def northeast(self) -> utils.Location:
         """The city's northeast corner of the Google Maps viewport.

         Implementation detail: This property is cached as none of the
         underlying attributes to calculate the value are to be changed.
         """
-        return utils.Location(self.northeast_latitude, self.northeast_longitude)
+        if not hasattr(self, '_northeast'):  # noqa:WPS421 note:d334120e
+            self._northeast = utils.Location(
+                self.northeast_latitude, self.northeast_longitude,
+            )

-    @functools.cached_property
+        return self._northeast
+
+    @property
     def southwest(self) -> utils.Location:
         """The city's southwest corner of the Google Maps viewport.

         Implementation detail: This property is cached as none of the
         underlying attributes to calculate the value are to be changed.
         """
-        return utils.Location(self.southwest_latitude, self.southwest_longitude)
+        if not hasattr(self, '_southwest'):  # noqa:WPS421 note:d334120e
+            self._southwest = utils.Location(
+                self.southwest_latitude, self.southwest_longitude,
+            )
+
+        return self._southwest

     @property
     def total_x(self) -> int:
@@ -93,17 +103,16 @@ class City(meta.Base):
     def clear_map(self) -> City:  # pragma: no cover
         """Create a new `folium.Map` object aligned with the city's viewport.

-        The map is available via the `.map` property. Note that it is mutable
-        and changed from various locations in the code base.
+        The map is available via the `.map` property. Note that it is a
+        mutable objects that is changed from various locations in the code base.

         Returns:
             self: enabling method chaining
-        """  # noqa:DAR203 note:d334120e
+        """  # noqa:DAR203
         self._map = folium.Map(
             location=[self.center_latitude, self.center_longitude],
             zoom_start=self.initial_zoom,
         )

         return self

     @property  # pragma: no cover
@@ -129,7 +138,7 @@ class City(meta.Base):
             `.map` for convenience in interactive usage
         """
         # Obtain all primary `Address`es in the city that host `Restaurant`s.
-        addresses = (
+        addresses = (  # noqa:ECE001
             db.session.query(db.Address)
             .filter(
                 db.Address.id.in_(
@@ -146,7 +155,7 @@ class City(meta.Base):
         for address in addresses:
             # Show the restaurant's name if there is only one.
             # Otherwise, list all the restaurants' ID's.
-            restaurants = (
+            restaurants = (  # noqa:ECE001
                 db.session.query(db.Restaurant)
                 .join(db.Address, db.Restaurant.address_id == db.Address.id)
                 .filter(db.Address.primary_id == address.id)
@@ -161,7 +170,7 @@ class City(meta.Base):

             if order_counts:
                 # Calculate the number of orders for ALL restaurants ...
-                n_orders = (
+                n_orders = (  # noqa:ECE001
                     db.session.query(db.Order.id)
                     .join(db.Address, db.Order.pickup_address_id == db.Address.id)
                     .filter(db.Address.primary_id == address.id)
@@ -212,11 +221,11 @@ class City(meta.Base):
             sa.text(
                 f"""  -- # noqa:S608
                 SELECT DISTINCT
-                    {config.CLEAN_SCHEMA}.addresses.zip_code
+                    zip_code
                 FROM
-                    {config.CLEAN_SCHEMA}.addresses AS addresses
+                    {config.CLEAN_SCHEMA}.addresses
                 WHERE
-                    {config.CLEAN_SCHEMA}.addresses.city_id = {self.id};
+                    city_id = {self.id};
                 """,
             ),
         )
|
|
@ -63,18 +63,15 @@ class Customer(meta.Base):
|
||||||
|
|
||||||
# Obtain all primary `Address`es where
|
# Obtain all primary `Address`es where
|
||||||
# at least one delivery was made to `self`.
|
# at least one delivery was made to `self`.
|
||||||
delivery_addresses = (
|
delivery_addresses = ( # noqa:ECE001
|
||||||
db.session.query(db.Address)
|
db.session.query(db.Address)
|
||||||
.filter(
|
.filter(
|
||||||
db.Address.id.in_(
|
db.Address.id.in_(
|
||||||
row.primary_id
|
|
||||||
for row in (
|
|
||||||
db.session.query(db.Address.primary_id) # noqa:WPS221
|
db.session.query(db.Address.primary_id) # noqa:WPS221
|
||||||
.join(db.Order, db.Address.id == db.Order.delivery_address_id)
|
.join(db.Order, db.Address.id == db.Order.delivery_address_id)
|
||||||
.filter(db.Order.customer_id == self.id)
|
.filter(db.Order.customer_id == self.id)
|
||||||
.distinct()
|
.distinct()
|
||||||
.all()
|
.all(),
|
||||||
)
|
|
||||||
),
|
),
|
||||||
)
|
)
|
||||||
.all()
|
.all()
|
||||||
|
|
@@ -82,7 +79,7 @@ class Customer(meta.Base):

         for address in delivery_addresses:
             if order_counts:
-                n_orders = (
+                n_orders = (  # noqa:ECE001
                     db.session.query(db.Order)
                     .join(db.Address, db.Order.delivery_address_id == db.Address.id)
                     .filter(db.Order.customer_id == self.id)
@@ -114,7 +111,7 @@ class Customer(meta.Base):
             )

         if restaurants:
-            pickup_addresses = (
+            pickup_addresses = (  # noqa:ECE001
                 db.session.query(db.Address)
                 .filter(
                     db.Address.id.in_(
@@ -132,7 +129,7 @@ class Customer(meta.Base):
                 # Show the restaurant's name if there is only one.
                 # Otherwise, list all the restaurants' ID's.
                 # We cannot show the `Order.restaurant.name` due to the aggregation.
-                restaurants = (
+                restaurants = (  # noqa:ECE001
                     db.session.query(db.Restaurant)
                     .join(db.Address, db.Restaurant.address_id == db.Address.id)
                     .filter(db.Address.primary_id == address.id)  # noqa:WPS441
@@ -148,7 +145,7 @@ class Customer(meta.Base):
                 )

             if order_counts:
-                n_orders = (
+                n_orders = (  # noqa:ECE001
                     db.session.query(db.Order)
                     .join(db.Address, db.Order.pickup_address_id == db.Address.id)
                     .filter(db.Order.customer_id == self.id)
|
|
@ -31,7 +31,7 @@ class Forecast(meta.Base):
|
||||||
model = sa.Column(sa.Unicode(length=20), nullable=False)
|
model = sa.Column(sa.Unicode(length=20), nullable=False)
|
||||||
# We also store the actual order counts for convenient retrieval.
|
# We also store the actual order counts for convenient retrieval.
|
||||||
# A `UniqueConstraint` below ensures that redundant values that
|
# A `UniqueConstraint` below ensures that redundant values that
|
||||||
# are to be expected are consistent across rows.
|
# are to be expected are consistend across rows.
|
||||||
actual = sa.Column(sa.SmallInteger, nullable=False)
|
actual = sa.Column(sa.SmallInteger, nullable=False)
|
||||||
# Raw `.prediction`s are stored as `float`s (possibly negative).
|
# Raw `.prediction`s are stored as `float`s (possibly negative).
|
||||||
# The rounding is then done on the fly if required.
|
# The rounding is then done on the fly if required.
|
||||||
|
|
@@ -157,7 +157,7 @@ class Forecast(meta.Base):
        Background: The functions in `urban_meal_delivery.forecasts.methods`
        return `pd.Dataframe`s with "start_at" (i.e., `pd.Timestamp` objects)
        values in the index and five columns "prediction", "low80", "high80",
-       "low95", and "high95" with `np.float` values. The `*Model.predict()`
+       "low95", and "high95" with `np.float` values. The `*Model.predic()`
        methods in `urban_meal_delivery.forecasts.models` then add an "actual"
        column. This constructor converts these results into ORM models.
        Also, the `np.float` values are cast as plain `float` ones as
|
|
@ -76,7 +76,7 @@ class Grid(meta.Base):
|
||||||
# `Pixel`s grouped by `.n_x`-`.n_y` coordinates.
|
# `Pixel`s grouped by `.n_x`-`.n_y` coordinates.
|
||||||
pixels = {}
|
pixels = {}
|
||||||
|
|
||||||
pickup_addresses = (
|
pickup_addresses = ( # noqa:ECE:001
|
||||||
db.session.query(db.Address)
|
db.session.query(db.Address)
|
||||||
.join(db.Order, db.Address.id == db.Order.pickup_address_id)
|
.join(db.Order, db.Address.id == db.Order.pickup_address_id)
|
||||||
.filter(db.Address.city == city)
|
.filter(db.Address.city == city)
|
||||||
|
|
|
||||||
|
|
@ -2,13 +2,10 @@
|
||||||
|
|
||||||
import datetime
|
import datetime
|
||||||
|
|
||||||
import folium
|
|
||||||
import sqlalchemy as sa
|
import sqlalchemy as sa
|
||||||
from sqlalchemy import orm
|
from sqlalchemy import orm
|
||||||
from sqlalchemy.dialects import postgresql
|
from sqlalchemy.dialects import postgresql
|
||||||
|
|
||||||
from urban_meal_delivery import config
|
|
||||||
from urban_meal_delivery import db
|
|
||||||
from urban_meal_delivery.db import meta
|
from urban_meal_delivery.db import meta
|
||||||
|
|
||||||
|
|
||||||
|
|
@ -527,36 +524,3 @@ class Order(meta.Base): # noqa:WPS214
|
||||||
return '<{cls}(#{order_id})>'.format(
|
return '<{cls}(#{order_id})>'.format(
|
||||||
cls=self.__class__.__name__, order_id=self.id,
|
cls=self.__class__.__name__, order_id=self.id,
|
||||||
)
|
)
|
||||||
|
|
||||||
def draw(self) -> folium.Map: # pragma: no cover
|
|
||||||
"""Draw the `.waypoints` from `.pickup_address` to `.delivery_address`.
|
|
||||||
|
|
||||||
Important: Do not put this in an automated script as a method call
|
|
||||||
triggers an API call to the Google Maps API and may result in costs.
|
|
||||||
|
|
||||||
Returns:
|
|
||||||
`...city.map` for convenience in interactive usage
|
|
||||||
"""
|
|
||||||
path = db.Path.from_order(self)
|
|
||||||
|
|
||||||
restaurant_tooltip = f'{self.restaurant.name} (#{self.restaurant.id})'
|
|
||||||
customer_tooltip = f'Customer #{self.customer.id}'
|
|
||||||
|
|
||||||
# Because the underlying distance matrix is symmetric (i.e., a DB constraint),
|
|
||||||
# we must check if the `.pickup_address` is the couriers' `Path`'s start.
|
|
||||||
if path.first_address is self.pickup_address:
|
|
||||||
reverse = False
|
|
||||||
start_tooltip, end_tooltip = restaurant_tooltip, customer_tooltip
|
|
||||||
else:
|
|
||||||
reverse = True
|
|
||||||
start_tooltip, end_tooltip = customer_tooltip, restaurant_tooltip
|
|
||||||
|
|
||||||
# This triggers `Path.sync_with_google_maps()` behind the scenes.
|
|
||||||
return path.draw(
|
|
||||||
reverse=reverse,
|
|
||||||
start_tooltip=start_tooltip,
|
|
||||||
end_tooltip=end_tooltip,
|
|
||||||
start_color=config.RESTAURANT_COLOR,
|
|
||||||
end_color=config.CUSTOMER_COLOR,
|
|
||||||
path_color=config.NEUTRAL_COLOR,
|
|
||||||
)
|
|
||||||
|
|
|
||||||
|
|
@@ -2,7 +2,6 @@

 from __future__ import annotations

-import functools
 from typing import List

 import folium
@@ -69,50 +68,56 @@ class Pixel(meta.Base):
         """The area of a pixel in square kilometers."""
         return self.grid.pixel_area

-    @functools.cached_property
+    @property
     def northeast(self) -> utils.Location:
         """The pixel's northeast corner, relative to `.grid.city.southwest`.

         Implementation detail: This property is cached as none of the
         underlying attributes to calculate the value are to be changed.
         """
-        easting, northing = (
-            self.grid.city.southwest.easting + ((self.n_x + 1) * self.side_length),
-            self.grid.city.southwest.northing + ((self.n_y + 1) * self.side_length),
-        )
-        latitude, longitude = utm.to_latlon(
-            easting, northing, *self.grid.city.southwest.zone_details,
-        )
-
-        location = utils.Location(latitude, longitude)
-        location.relate_to(self.grid.city.southwest)
-
-        return location
+        if not hasattr(self, '_northeast'):  # noqa:WPS421 note:d334120e
+            # The origin is the southwest corner of the `.grid.city`'s viewport.
+            easting_origin = self.grid.city.southwest.easting
+            northing_origin = self.grid.city.southwest.northing
+
+            # `+1` as otherwise we get the pixel's `.southwest` corner.
+            easting = easting_origin + ((self.n_x + 1) * self.side_length)
+            northing = northing_origin + ((self.n_y + 1) * self.side_length)
+            zone, band = self.grid.city.southwest.zone_details
+            latitude, longitude = utm.to_latlon(easting, northing, zone, band)
+
+            self._northeast = utils.Location(latitude, longitude)
+            self._northeast.relate_to(self.grid.city.southwest)
+
+        return self._northeast

-    @functools.cached_property
+    @property
     def southwest(self) -> utils.Location:
-        """The pixel's southwest corner, relative to `.grid.city.southwest`.
+        """The pixel's northeast corner, relative to `.grid.city.southwest`.

         Implementation detail: This property is cached as none of the
         underlying attributes to calculate the value are to be changed.
         """
-        easting, northing = (
-            self.grid.city.southwest.easting + (self.n_x * self.side_length),
-            self.grid.city.southwest.northing + (self.n_y * self.side_length),
-        )
-        latitude, longitude = utm.to_latlon(
-            easting, northing, *self.grid.city.southwest.zone_details,
-        )
-
-        location = utils.Location(latitude, longitude)
-        location.relate_to(self.grid.city.southwest)
-
-        return location
+        if not hasattr(self, '_southwest'):  # noqa:WPS421 note:d334120e
+            # The origin is the southwest corner of the `.grid.city`'s viewport.
+            easting_origin = self.grid.city.southwest.easting
+            northing_origin = self.grid.city.southwest.northing
+
+            easting = easting_origin + (self.n_x * self.side_length)
+            northing = northing_origin + (self.n_y * self.side_length)
+            zone, band = self.grid.city.southwest.zone_details
+            latitude, longitude = utm.to_latlon(easting, northing, zone, band)
+
+            self._southwest = utils.Location(latitude, longitude)
+            self._southwest.relate_to(self.grid.city.southwest)
+
+        return self._southwest

-    @functools.cached_property
+    @property
     def restaurants(self) -> List[db.Restaurant]:  # pragma: no cover
         """Obtain all `Restaurant`s in `self`."""
-        return (
+        if not hasattr(self, '_restaurants'):  # noqa:WPS421 note:d334120e
+            self._restaurants = (  # noqa:ECE001
                 db.session.query(db.Restaurant)
                 .join(
                     db.AddressPixelAssociation,
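Both sides of the hunk above compute the same corner offsets; only the caching style differs. As a dependency-free sketch of that arithmetic (illustrative names, not the project's API): a pixel's corners are plain offsets, in meters, from the grid's southwest origin in UTM coordinates.

```python
def pixel_corners(origin, n_x, n_y, side_length):
    """Return the (southwest, northeast) corners of grid pixel (n_x, n_y).

    `origin` is the (easting, northing) of the grid's southwest corner;
    all values are in meters, as UTM coordinates are.
    """
    easting, northing = origin
    southwest = (easting + n_x * side_length, northing + n_y * side_length)
    # `+ 1` as otherwise we would get the pixel's southwest corner again.
    northeast = (
        easting + (n_x + 1) * side_length,
        northing + (n_y + 1) * side_length,
    )
    return southwest, northeast


# Pixel (2, 1) in a grid of 1000m-sided pixels:
pixel_corners((500_000, 4_500_000), 2, 1, 1000)
# -> ((502000, 4501000), (503000, 4502000))
```

Only after this offset step does the code above convert back to latitude/longitude with `utm.to_latlon()`.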
@@ -122,6 +127,8 @@ class Pixel(meta.Base):
             .all()
         )

+        return self._restaurants
+
     def clear_map(self) -> Pixel:  # pragma: no cover
         """Shortcut to the `.city.clear_map()` method.

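One side of these hunks caches by hand with `hasattr` and a private attribute; the other uses `functools.cached_property`. A minimal sketch (not the project's code) of why the two are equivalent for attributes that never change once computed:

```python
import functools


class Expensive:
    calls = 0  # class-level counter, only to demonstrate the caching

    @functools.cached_property
    def value(self) -> int:
        # Runs at most once per instance; the result is then stored
        # in `self.__dict__` and served from there on later accesses.
        Expensive.calls += 1
        return 42


obj = Expensive()
obj.value  # computed on first access
obj.value  # served from the instance dict, no recomputation
```

`functools.cached_property` (Python 3.8+) thus removes the boilerplate while keeping the "compute once, then reuse" behavior the docstrings describe.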
@@ -175,7 +182,7 @@ class Pixel(meta.Base):
         if restaurants:
             # Obtain all primary `Address`es in the city that host `Restaurant`s
             # and are in the `self` `Pixel`.
-            addresses = (
+            addresses = (  # noqa:ECE001
                 db.session.query(db.Address)
                 .filter(
                     db.Address.id.in_(
@@ -201,7 +208,7 @@ class Pixel(meta.Base):
             for address in addresses:
                 # Show the restaurant's name if there is only one.
                 # Otherwise, list all the restaurants' ID's.
-                restaurants = (
+                restaurants = (  # noqa:ECE001
                     db.session.query(db.Restaurant)
                     .join(db.Address, db.Restaurant.address_id == db.Address.id)
                     .filter(db.Address.primary_id == address.id)
@@ -218,7 +225,7 @@ class Pixel(meta.Base):

                 if order_counts:
                     # Calculate the number of orders for ALL restaurants ...
-                    n_orders = (
+                    n_orders = (  # noqa:ECE001
                         db.session.query(db.Order.id)
                         .join(db.Address, db.Order.pickup_address_id == db.Address.id)
                         .filter(db.Address.primary_id == address.id)
@@ -45,11 +45,7 @@ class Restaurant(meta.Base):

     # Relationships
     address = orm.relationship('Address', back_populates='restaurants')
-    orders = orm.relationship(
-        'Order',
-        back_populates='restaurant',
-        overlaps='orders_picked_up,pickup_address',
-    )
+    orders = orm.relationship('Order', back_populates='restaurant')

     def __repr__(self) -> str:
         """Non-literal text representation."""
@@ -87,20 +83,15 @@ class Restaurant(meta.Base):
         if customers:
             # Obtain all primary `Address`es in the city that
             # received at least one delivery from `self`.
-            delivery_addresses = (
+            delivery_addresses = (  # noqa:ECE001
                 db.session.query(db.Address)
                 .filter(
                     db.Address.id.in_(
-                        row.primary_id
-                        for row in (
                         db.session.query(db.Address.primary_id)  # noqa:WPS221
-                            .join(
-                                db.Order, db.Address.id == db.Order.delivery_address_id,
-                            )
+                        .join(db.Order, db.Address.id == db.Order.delivery_address_id)
                         .filter(db.Order.restaurant_id == self.id)
                         .distinct()
-                            .all()
-                        )
+                        .all(),
                     ),
                 )
                 .all()
@@ -108,7 +99,7 @@ class Restaurant(meta.Base):

             for address in delivery_addresses:
                 if order_counts:
-                    n_orders = (
+                    n_orders = (  # noqa:ECE001
                         db.session.query(db.Order)
                         .join(db.Address, db.Order.delivery_address_id == db.Address.id)
                         .filter(db.Order.restaurant_id == self.id)
@@ -7,7 +7,7 @@ from typing import Optional, Tuple
 import utm


-class Location:  # noqa:WPS214
+class Location:
     """A location represented in WGS84 and UTM coordinates.

     WGS84:
@@ -15,7 +15,7 @@ class Location:  # noqa:WPS214
     - assumes earth is a sphere and models the location in 3D

     UTM:
-    - the Universal Transverse Mercator system
+    - the Universal Transverse Mercator sytem
     - projects WGS84 coordinates onto a 2D map
     - can be used for visualizations and calculations directly
     - distances are in meters
@@ -67,11 +67,6 @@ class Location:  # noqa:WPS214
         """
         return self._longitude

-    @property
-    def lat_lng(self) -> Tuple[float, float]:
-        """The `.latitude` and `.longitude` as a 2-`tuple`."""
-        return self._latitude, self._longitude
-
     @property
     def easting(self) -> int:
         """The easting of the location in meters (UTM)."""
@@ -90,7 +85,7 @@ class Location:  # noqa:WPS214
     @property
     def zone_details(self) -> Tuple[int, str]:
         """The UTM zone of the location as the zone number and the band."""
-        return self._zone, self._band
+        return (self._zone, self._band)

     def __eq__(self, other: object) -> bool:
         """Check if two `Location` objects are the same location."""
@@ -31,8 +31,8 @@ def predict(
     Raises:
         ValueError: if `training_ts` contains `NaN` values
     """
-    # Initialize R only if it is actually used.
-    # For example, the nox session "ci-tests-fast" does not use it.
+    # Initialize R only if necessary as it is tested only in nox's
+    # "ci-tests-slow" session and "ci-tests-fast" should not fail.
    from urban_meal_delivery import init_r  # noqa:F401,WPS433

    # Re-seed R every time it is used to ensure reproducibility.
@@ -154,8 +154,8 @@ def stl(  # noqa:C901,WPS210,WPS211,WPS231
     else:
         robust = False

-    # Initialize R only if it is actually used.
-    # For example, the nox session "ci-tests-fast" does not use it.
+    # Initialize R only if necessary as it is tested only in nox's
+    # "ci-tests-slow" session and "ci-tests-fast" should not fail.
     from urban_meal_delivery import init_r  # noqa:F401,WPS433

     # Re-seed R every time it is used to ensure reproducibility.
@@ -32,8 +32,8 @@ def predict(
     Raises:
         ValueError: if `training_ts` contains `NaN` values
     """
-    # Initialize R only if it is actually used.
-    # For example, the nox session "ci-tests-fast" does not use it.
+    # Initialize R only if necessary as it is tested only in nox's
+    # "ci-tests-slow" session and "ci-tests-fast" should not fail.
    from urban_meal_delivery import init_r  # noqa:F401,WPS433

    # Re-seed R every time it is used to ensure reproducibility.
@@ -15,7 +15,7 @@ For the paper check:

 This sub-package is organized as follows. The `base` module defines an abstract
 `ForecastingModelABC` class that unifies how the concrete `*Model`s work.
-While the abstract `.predict()` method returns a `pd.DataFrame` (= basically,
+While the abstact `.predict()` method returns a `pd.DataFrame` (= basically,
 the result of one of the forecasting `methods`, the concrete `.make_forecast()`
 method converts the results into `Forecast` (=ORM) objects.
 Also, `.make_forecast()` implements a caching strategy where already made
@@ -23,7 +23,7 @@ Also, `.make_forecast()` implements a caching strategy where already made
 which could be a heavier computation.

 The `tactical` sub-package contains all the `*Model`s used to implement the
-predictive routing strategy employed by the UDP.
+UDP's predictive routing strategy.

 A future `planning` sub-package will contain the `*Model`s used to plan the
 `Courier`'s shifts a week ahead.
@@ -75,7 +75,7 @@ class ForecastingModelABC(abc.ABC):
         # noqa:DAR401 RuntimeError
         """
         if (  # noqa:WPS337
-            cached_forecast := db.session.query(db.Forecast)  # noqa:WPS221
+            cached_forecast := db.session.query(db.Forecast)  # noqa:ECE001,WPS221
             .filter_by(pixel=pixel)
             .filter_by(start_at=predict_at)
             .filter_by(time_step=self._order_history.time_step)
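The hunk above is part of the caching strategy mentioned in the docstrings: `.make_forecast()` first queries the database for an already-made `Forecast` via the walrus operator and only computes a new one on a miss. A hypothetical, database-free sketch of that cache-first pattern (names are illustrative):

```python
def get_or_compute(cache, key, compute):
    """Return `cache[key]` if present; otherwise compute, store, and return it."""
    if (hit := cache.get(key)) is not None:  # the walrus binds the lookup result
        return hit
    cache[key] = compute()
    return cache[key]


forecasts = {}
get_or_compute(forecasts, 'pixel-1@12:00', lambda: 17.0)  # miss: computes and stores
get_or_compute(forecasts, 'pixel-1@12:00', lambda: 99.0)  # hit: returns the stored 17.0
```

In the real code the "cache" is the `forecasts` table and the key is the `(pixel, start_at, time_step, …)` filter combination.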
@@ -1,8 +1,8 @@
 """Forecasting `*Model`s to predict demand for tactical purposes.

 The `*Model`s in this module predict only a small number (e.g., one)
-of time steps into the near future and are used to implement the
-predictive routing strategies employed by the UDP.
+of time steps into the near future and are used to implement the UDP's
+predictive routing strategies.

 They are classified into "horizontal", "vertical", and "real-time" models
 with respect to what historic data they are trained on and how often they
@@ -51,7 +51,7 @@ class HorizontalETSModel(base.ForecastingModelABC):
         # Make `predictions` with the seasonal ETS method ("ZZZ" model).
         predictions = methods.ets.predict(
             training_ts=training_ts,
-            forecast_interval=pd.DatetimeIndex(actuals_ts.index),
+            forecast_interval=actuals_ts.index,
             frequency=frequency,  # `== 7`, the number of weekdays
             seasonal_fit=True,  # because there was no decomposition before
         )
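The only change in this hunk (and the matching ones in the ARIMA models below) swaps `actuals_ts.index` for an explicit `pd.DatetimeIndex(...)` wrapper. A small sketch, assuming pandas is available as in the project, of why the two are interchangeable at runtime:

```python
import pandas as pd

actuals_ts = pd.Series(
    [12.0, 15.0],
    index=pd.DatetimeIndex(['2016-07-01 11:00', '2016-07-01 12:00']),
)

# Re-wrapping a DatetimeIndex yields an equal index; the explicit constructor
# mainly helps static type checkers narrow `.index` to `pd.DatetimeIndex`.
wrapped = pd.DatetimeIndex(actuals_ts.index)
wrapped.equals(actuals_ts.index)
```

So the change is cosmetic/typing-related, not behavioral.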
@@ -59,7 +59,7 @@ class HorizontalETSModel(base.ForecastingModelABC):
         predictions.insert(loc=0, column='actual', value=actuals_ts)

         # Sanity checks.
-        if predictions.isnull().sum().any():  # pragma: no cover
+        if predictions.isnull().any().any():  # pragma: no cover
             raise RuntimeError('missing predictions in hets model')
         if predict_at not in predictions.index:  # pragma: no cover
             raise RuntimeError('missing prediction for `predict_at`')
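Both spellings of the sanity check flag the same condition, so this is a stylistic change. A small sketch (assuming pandas, as in the project): `.isnull().sum()` counts missing values per column while `.isnull().any()` flags columns, and calling `.any()` on either result is truthy iff at least one value is missing.

```python
import pandas as pd

predictions = pd.DataFrame({'actual': [1.0, None], 'prediction': [1.1, 2.2]})

# `None` becomes NaN in the float column, so one value is missing.
has_missing_via_sum = bool(predictions.isnull().sum().any())
has_missing_via_any = bool(predictions.isnull().any().any())
# both are True for this frame
```

The same substitution appears in the `rtarima` and `varima` hunks further down.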
@@ -59,7 +59,7 @@ class RealtimeARIMAModel(base.ForecastingModelABC):
         # Make predictions for the seasonal component by linear extrapolation.
         seasonal_predictions = methods.extrapolate_season.predict(
             training_ts=decomposed_training_ts['seasonal'],
-            forecast_interval=pd.DatetimeIndex(actuals_ts.index),
+            forecast_interval=actuals_ts.index,
             frequency=frequency,
         )
@@ -68,7 +68,7 @@ class RealtimeARIMAModel(base.ForecastingModelABC):
             training_ts=(
                 decomposed_training_ts['trend'] + decomposed_training_ts['residual']
             ),
-            forecast_interval=pd.DatetimeIndex(actuals_ts.index),
+            forecast_interval=actuals_ts.index,
             # Because the seasonality was taken out before,
             # the `training_ts` has, by definition, a `frequency` of `1`.
             frequency=1,
@@ -109,7 +109,7 @@ class RealtimeARIMAModel(base.ForecastingModelABC):
         # Sanity checks.
         if len(predictions) != 1:  # pragma: no cover
             raise RuntimeError('real-time models should predict exactly one time step')
-        if predictions.isnull().sum().any():  # pragma: no cover
+        if predictions.isnull().any().any():  # pragma: no cover
             raise RuntimeError('missing predictions in rtarima model')
         if predict_at not in predictions.index:  # pragma: no cover
             raise RuntimeError('missing prediction for `predict_at`')
@@ -61,7 +61,7 @@ class VerticalARIMAModel(base.ForecastingModelABC):
         # Make predictions for the seasonal component by linear extrapolation.
         seasonal_predictions = methods.extrapolate_season.predict(
             training_ts=decomposed_training_ts['seasonal'],
-            forecast_interval=pd.DatetimeIndex(actuals_ts.index),
+            forecast_interval=actuals_ts.index,
             frequency=frequency,
         )
@@ -70,7 +70,7 @@ class VerticalARIMAModel(base.ForecastingModelABC):
             training_ts=(
                 decomposed_training_ts['trend'] + decomposed_training_ts['residual']
             ),
-            forecast_interval=pd.DatetimeIndex(actuals_ts.index),
+            forecast_interval=actuals_ts.index,
             # Because the seasonality was taken out before,
             # the `training_ts` has, by definition, a `frequency` of `1`.
             frequency=1,
@@ -111,7 +111,7 @@ class VerticalARIMAModel(base.ForecastingModelABC):
         # Sanity checks.
         if len(predictions) <= 1:  # pragma: no cover
             raise RuntimeError('vertical models should predict several time steps')
-        if predictions.isnull().sum().any():  # pragma: no cover
+        if predictions.isnull().any().any():  # pragma: no cover
             raise RuntimeError('missing predictions in varima model')
         if predict_at not in predictions.index:  # pragma: no cover
             raise RuntimeError('missing prediction for `predict_at`')
@@ -84,50 +84,41 @@ class OrderHistory:
                     pixels.pixel_id,
                     DATE_TRUNC('MINUTE', orders.placed_at)
                         AS placed_at_without_seconds,
-                    (
-                        (
-                            (
+                    ((
                         EXTRACT(MINUTES FROM orders.placed_at)::INTEGER
                         % {self._time_step}
-                            )::TEXT
-                            ||
-                            ' MINUTES'
-                        )::INTERVAL
-                    ) AS minutes_to_be_cut
+                    )::TEXT || ' MINUTES')::INTERVAL
+                        AS minutes_to_be_cut
                 FROM (
                     SELECT
-                        {config.CLEAN_SCHEMA}.orders.id,
-                        {config.CLEAN_SCHEMA}.orders.placed_at,
-                        {config.CLEAN_SCHEMA}.orders.pickup_address_id
+                        id,
+                        placed_at,
+                        pickup_address_id
                     FROM
                         {config.CLEAN_SCHEMA}.orders
                     INNER JOIN (
                         SELECT
-                            {config.CLEAN_SCHEMA}.addresses.id AS address_id
+                            id AS address_id
                         FROM
                             {config.CLEAN_SCHEMA}.addresses
                         WHERE
-                            {config.CLEAN_SCHEMA}.addresses.city_id
-                                = {self._grid.city.id}
+                            city_id = {self._grid.city.id}
                     ) AS in_city
-                        ON {config.CLEAN_SCHEMA}.orders.pickup_address_id
-                            = in_city.address_id
+                        ON orders.pickup_address_id = in_city.address_id
                     WHERE
-                        {config.CLEAN_SCHEMA}.orders.ad_hoc IS TRUE
+                        ad_hoc IS TRUE
                 ) AS
                     orders
                 INNER JOIN (
                     SELECT
-                        {config.CLEAN_SCHEMA}.addresses_pixels.address_id,
-                        {config.CLEAN_SCHEMA}.addresses_pixels.pixel_id
+                        address_id,
+                        pixel_id
                     FROM
                         {config.CLEAN_SCHEMA}.addresses_pixels
                     WHERE
-                        {config.CLEAN_SCHEMA}.addresses_pixels.grid_id
-                            = {self._grid.id}
+                        grid_id = {self._grid.id}
                     AND
-                        {config.CLEAN_SCHEMA}.addresses_pixels.city_id
-                            = {self._grid.city.id} -- -> sanity check
+                        city_id = {self._grid.city.id} -- -> sanity check
                 ) AS pixels
                     ON orders.pickup_address_id = pixels.address_id
             ) AS placed_at_aggregated_into_start_at
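The nested SQL above buckets `placed_at` into `time_step`-minute windows: truncate to the minute, then subtract `minute % time_step` as an interval. The same logic in plain Python (a sketch, not the project's code):

```python
import datetime as dt


def floor_to_time_step(placed_at: dt.datetime, time_step: int) -> dt.datetime:
    """Round `placed_at` down to the enclosing `time_step`-minute window.

    Mirrors the SQL: `DATE_TRUNC('MINUTE', ...)` followed by cutting
    `EXTRACT(MINUTES ...) % time_step` minutes.
    """
    truncated = placed_at.replace(second=0, microsecond=0)
    return truncated - dt.timedelta(minutes=truncated.minute % time_step)


floor_to_time_step(dt.datetime(2016, 7, 1, 11, 49, 33), 60)
# -> datetime.datetime(2016, 7, 1, 11, 0)
```

With `time_step=15` the same order would land in the 11:45 window.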
@@ -551,9 +542,9 @@ class OrderHistory:
             pixel_id=pixel_id, predict_day=predict_day, train_horizon=train_horizon,
         )

-        # For now, we only make forecasts with 7 and 8 weeks
+        # For now, we only make forecasts with 8 weeks
         # as the training horizon (note:4f79e8fa).
-        if train_horizon in {7, 8}:
+        if train_horizon == 8:
            if add >= 25:  # = "high demand"
                 return models.HorizontalETSModel(order_history=self)
             elif add >= 10:  # = "medium demand"
@@ -3,7 +3,7 @@
 The purpose of this module is to import all the R packages that are installed
 into a sub-folder (see `config.R_LIBS_PATH`) in the project's root directory.

-The Jupyter notebook "research/00_r_dependencies.ipynb" can be used to install all
+The Jupyter notebook "research/r_dependencies.ipynb" can be used to install all
 R dependencies on a Ubuntu/Debian based system.
 """
@@ -24,5 +24,5 @@ try:  # noqa:WPS229
     rpackages.importr('zoo')

 except rpackages.PackageNotInstalledError:  # pragma: no cover
-    msg = 'See the "research/00_r_dependencies.ipynb" notebook!'
+    msg = 'See the "research/r_dependencies.ipynb" notebook!'
     raise rpackages.PackageNotInstalledError(msg) from None
@@ -5,7 +5,6 @@ in the CLI layer need access to the database.
 """

 import os
-import warnings

 import pytest
 import sqlalchemy as sa
@@ -95,8 +94,6 @@ def db_session(db_connection):

     finally:
         session.close()

-        with warnings.catch_warnings(record=True):
-            transaction.rollback()
+        transaction.rollback()
@@ -1,682 +0,0 @@
-"""Test the ORM's `Path` model."""
-
-import json
-
-import googlemaps
-import pytest
-import sqlalchemy as sqla
-from geopy import distance
-from sqlalchemy import exc as sa_exc
-
-from urban_meal_delivery import db
-from urban_meal_delivery.db import utils
-
-
-@pytest.fixture
-def another_address(make_address):
-    """Another `Address` object in the `city`."""
-    return make_address()
-
-
-@pytest.fixture
-def path(address, another_address, make_address):
-    """A `Path` from `address` to `another_address`."""
-    air_distance = distance.great_circle(  # noqa:WPS317
-        address.location.lat_lng, another_address.location.lat_lng,
-    ).meters
-
-    # We put 5 latitude-longitude pairs as the "path" from
-    # `.first_address` to `.second_address`.
-    directions = json.dumps(
-        [
-            (float(add.latitude), float(add.longitude))
-            for add in (make_address() for _ in range(5))  # noqa:WPS335
-        ],
-    )
-
-    return db.Path(
-        first_address=address,
-        second_address=another_address,
-        air_distance=round(air_distance),
-        bicycle_distance=round(1.25 * air_distance),
-        bicycle_duration=300,
-        _directions=directions,
-    )
-
-
-class TestSpecialMethods:
-    """Test special methods in `Path`."""
-
-    def test_create_an_address_address_association(self, path):
-        """Test instantiation of a new `Path` object."""
-        assert path is not None
-
-
-@pytest.mark.db
-@pytest.mark.no_cover
-class TestConstraints:
-    """Test the database constraints defined in `Path`."""
-
-    def test_insert_into_database(self, db_session, path):
-        """Insert an instance into the (empty) database."""
-        assert db_session.query(db.Path).count() == 0
-
-        db_session.add(path)
-        db_session.commit()
-
-        assert db_session.query(db.Path).count() == 1
-
-    def test_delete_a_referenced_first_address(self, db_session, path):
-        """Remove a record that is referenced with a FK."""
-        db_session.add(path)
-        db_session.commit()
-
-        # Must delete without ORM as otherwise an UPDATE statement is emitted.
-        stmt = sqla.delete(db.Address).where(db.Address.id == path.first_address.id)
-
-        with pytest.raises(
-            sa_exc.IntegrityError,
-            match='fk_addresses_addresses_to_addresses_via_first_address',  # shortened
-        ):
-            db_session.execute(stmt)
-
-    def test_delete_a_referenced_second_address(self, db_session, path):
-        """Remove a record that is referenced with a FK."""
-        db_session.add(path)
-        db_session.commit()
-
-        # Must delete without ORM as otherwise an UPDATE statement is emitted.
-        stmt = sqla.delete(db.Address).where(db.Address.id == path.second_address.id)
-
-        with pytest.raises(
-            sa_exc.IntegrityError,
-            match='fk_addresses_addresses_to_addresses_via_second_address',  # shortened
-        ):
-            db_session.execute(stmt)
-
-    def test_reference_an_invalid_city(self, db_session, address, another_address):
-        """Insert a record with an invalid foreign key."""
-        db_session.add(address)
-        db_session.add(another_address)
-        db_session.commit()
-
-        # Must insert without ORM as otherwise SQLAlchemy figures out
-        # that something is wrong before any query is sent to the database.
-        stmt = sqla.insert(db.Path).values(
-            first_address_id=address.id,
-            second_address_id=another_address.id,
-            city_id=999,
-            air_distance=123,
-        )
-
-        with pytest.raises(
-            sa_exc.IntegrityError,
-            match='fk_addresses_addresses_to_addresses_via_first_address',  # shortened
-        ):
-            db_session.execute(stmt)
-
-    def test_redundant_addresses(self, db_session, path):
-        """Insert a record that violates a unique constraint."""
-        db_session.add(path)
-        db_session.commit()
-
-        # Must insert without ORM as otherwise SQLAlchemy figures out
-        # that something is wrong before any query is sent to the database.
-        stmt = sqla.insert(db.Path).values(
-            first_address_id=path.first_address.id,
-            second_address_id=path.second_address.id,
-            city_id=path.city_id,
-            air_distance=path.air_distance,
-        )
-
-        with pytest.raises(sa_exc.IntegrityError, match='duplicate key value'):
-            db_session.execute(stmt)
-
-    def test_symmetric_addresses(self, db_session, path):
-        """Insert a record that violates a check constraint."""
-        db_session.add(path)
-        db_session.commit()
-
-        another_path = db.Path(
-            first_address=path.second_address,
-            second_address=path.first_address,
-            air_distance=path.air_distance,
-        )
db_session.add(another_path)
|
|
||||||
|
|
||||||
with pytest.raises(
|
|
||||||
sa_exc.IntegrityError,
|
|
||||||
match='ck_addresses_addresses_on_distances_are_symmetric_for_bicycles',
|
|
||||||
):
|
|
||||||
db_session.commit()
|
|
||||||
|
|
||||||
def test_negative_air_distance(self, db_session, path):
|
|
||||||
"""Insert an instance with invalid data."""
|
|
||||||
path.air_distance = -1
|
|
||||||
db_session.add(path)
|
|
||||||
|
|
||||||
with pytest.raises(sa_exc.IntegrityError, match='realistic_air_distance'):
|
|
||||||
db_session.commit()
|
|
||||||
|
|
||||||
def test_air_distance_too_large(self, db_session, path):
|
|
||||||
"""Insert an instance with invalid data."""
|
|
||||||
path.air_distance = 20_000
|
|
||||||
path.bicycle_distance = 21_000
|
|
||||||
db_session.add(path)
|
|
||||||
|
|
||||||
with pytest.raises(sa_exc.IntegrityError, match='realistic_air_distance'):
|
|
||||||
db_session.commit()
|
|
||||||
|
|
||||||
def test_bicycle_distance_too_large(self, db_session, path):
|
|
||||||
"""Insert an instance with invalid data."""
|
|
||||||
path.bicycle_distance = 25_000
|
|
||||||
db_session.add(path)
|
|
||||||
|
|
||||||
with pytest.raises(sa_exc.IntegrityError, match='realistic_bicycle_distance'):
|
|
||||||
db_session.commit()
|
|
||||||
|
|
||||||
def test_air_distance_shorter_than_bicycle_distance(self, db_session, path):
|
|
||||||
"""Insert an instance with invalid data."""
|
|
||||||
path.bicycle_distance = round(0.75 * path.air_distance)
|
|
||||||
db_session.add(path)
|
|
||||||
|
|
||||||
with pytest.raises(sa_exc.IntegrityError, match='air_distance_is_shortest'):
|
|
||||||
db_session.commit()
|
|
||||||
|
|
||||||
@pytest.mark.parametrize('duration', [-1, 3601])
|
|
||||||
def test_unrealistic_bicycle_travel_time(self, db_session, path, duration):
|
|
||||||
"""Insert an instance with invalid data."""
|
|
||||||
path.bicycle_duration = duration
|
|
||||||
db_session.add(path)
|
|
||||||
|
|
||||||
with pytest.raises(
|
|
||||||
sa_exc.IntegrityError, match='realistic_bicycle_travel_time',
|
|
||||||
):
|
|
||||||
db_session.commit()
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.mark.db
class TestFromAddresses:
    """Test the alternative constructor `Path.from_addresses()`.

    Includes tests for the convenience method `Path.from_order()`,
    which redirects to `Path.from_addresses()`.
    """

    @pytest.fixture
    def _prepare_db(self, db_session, address):
        """Put the `address` into the database.

        `Address`es must be in the database as otherwise the `.city_id` column
        cannot be resolved in `Path.from_addresses()`.
        """
        db_session.add(address)

    @pytest.mark.usefixtures('_prepare_db')
    def test_make_path_instance(
        self, db_session, address, another_address,
    ):
        """Test instantiation of a new `Path` instance."""
        assert db_session.query(db.Path).count() == 0

        db.Path.from_addresses(address, another_address)

        assert db_session.query(db.Path).count() == 1

    @pytest.mark.usefixtures('_prepare_db')
    def test_make_the_same_path_instance_twice(
        self, db_session, address, another_address,
    ):
        """Test instantiation of a new `Path` instance."""
        assert db_session.query(db.Path).count() == 0

        db.Path.from_addresses(address, another_address)

        assert db_session.query(db.Path).count() == 1

        db.Path.from_addresses(another_address, address)

        assert db_session.query(db.Path).count() == 1

    @pytest.mark.usefixtures('_prepare_db')
    def test_structure_of_return_value(self, db_session, address, another_address):
        """Test instantiation of a new `Path` instance."""
        results = db.Path.from_addresses(address, another_address)

        assert isinstance(results, list)

    @pytest.mark.usefixtures('_prepare_db')
    def test_instances_must_have_air_distance(
        self, db_session, address, another_address,
    ):
        """Test instantiation of a new `Path` instance."""
        paths = db.Path.from_addresses(address, another_address)

        result = paths[0]

        assert result.air_distance is not None

    @pytest.mark.usefixtures('_prepare_db')
    def test_do_not_sync_instances_with_google_maps(
        self, db_session, address, another_address,
    ):
        """Test instantiation of a new `Path` instance."""
        paths = db.Path.from_addresses(address, another_address)

        result = paths[0]

        assert result.bicycle_distance is None
        assert result.bicycle_duration is None

    @pytest.mark.usefixtures('_prepare_db')
    def test_sync_instances_with_google_maps(
        self, db_session, address, another_address, monkeypatch,
    ):
        """Test instantiation of a new `Path` instance."""

        def sync(self):
            self.bicycle_distance = 1.25 * self.air_distance
            self.bicycle_duration = 300

        monkeypatch.setattr(db.Path, 'sync_with_google_maps', sync)

        paths = db.Path.from_addresses(address, another_address, google_maps=True)

        result = paths[0]

        assert result.bicycle_distance is not None
        assert result.bicycle_duration is not None

    @pytest.mark.usefixtures('_prepare_db')
    def test_one_path_for_two_addresses(self, db_session, address, another_address):
        """Test instantiation of a new `Path` instance."""
        result = len(db.Path.from_addresses(address, another_address))

        assert result == 1

    @pytest.mark.usefixtures('_prepare_db')
    def test_three_paths_for_three_addresses(self, db_session, make_address):
        """Test instantiation of a new `Path` instance."""
        result = len(db.Path.from_addresses(*[make_address() for _ in range(3)]))

        assert result == 3

    @pytest.mark.usefixtures('_prepare_db')
    def test_six_paths_for_four_addresses(self, db_session, make_address):
        """Test instantiation of a new `Path` instance."""
        result = len(db.Path.from_addresses(*[make_address() for _ in range(4)]))

        assert result == 6

    # Tests for the `Path.from_order()` convenience method.

    @pytest.mark.usefixtures('_prepare_db')
    def test_make_path_instance_from_order(
        self, db_session, order,
    ):
        """Test instantiation of a new `Path` instance."""
        assert db_session.query(db.Path).count() == 0

        db.Path.from_order(order)

        assert db_session.query(db.Path).count() == 1

    @pytest.mark.usefixtures('_prepare_db')
    def test_make_the_same_path_instance_from_order_twice(
        self, db_session, order,
    ):
        """Test instantiation of a new `Path` instance."""
        assert db_session.query(db.Path).count() == 0

        db.Path.from_order(order)

        assert db_session.query(db.Path).count() == 1

        db.Path.from_order(order)

        assert db_session.query(db.Path).count() == 1

    @pytest.mark.usefixtures('_prepare_db')
    def test_structure_of_return_value_from_order(self, db_session, order):
        """Test instantiation of a new `Path` instance."""
        result = db.Path.from_order(order)

        assert isinstance(result, db.Path)

    @pytest.mark.usefixtures('_prepare_db')
    def test_sync_instance_from_order_with_google_maps(
        self, db_session, order, monkeypatch,
    ):
        """Test instantiation of a new `Path` instance."""

        def sync(self):
            self.bicycle_distance = 1.25 * self.air_distance
            self.bicycle_duration = 300

        monkeypatch.setattr(db.Path, 'sync_with_google_maps', sync)

        result = db.Path.from_order(order, google_maps=True)

        assert result.bicycle_distance is not None
        assert result.bicycle_duration is not None


@pytest.mark.db
class TestSyncWithGoogleMaps:
    """Test the `Path.sync_with_google_maps()` method."""

    @pytest.fixture
    def api_response(self):
        """A typical (shortened) response by the Google Maps Directions API."""
        return [  # noqa:ECE001
            {
                'bounds': {
                    'northeast': {'lat': 44.8554284, 'lng': -0.5652398},
                    'southwest': {'lat': 44.8342256, 'lng': -0.5708206},
                },
                'copyrights': 'Map data ©2021',
                'legs': [
                    {
                        'distance': {'text': '3.0 km', 'value': 2999},
                        'duration': {'text': '10 mins', 'value': 596},
                        'end_address': '13 Place Paul et Jean Paul Avisseau, ...',
                        'end_location': {'lat': 44.85540839999999, 'lng': -0.5672105},
                        'start_address': '59 Rue Saint-François, 33000 Bordeaux, ...',
                        'start_location': {'lat': 44.8342256, 'lng': -0.570372},
                        'steps': [
                            {
                                'distance': {'text': '0.1 km', 'value': 138},
                                'duration': {'text': '1 min', 'value': 43},
                                'end_location': {
                                    'lat': 44.83434380000001,
                                    'lng': -0.5690105999999999,
                                },
                                'html_instructions': 'Head <b>east</b> on <b> ...',
                                'polyline': {'points': '}tspGxknBKcDIkB'},
                                'start_location': {'lat': 44.8342256, 'lng': -0.57032},
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '0.1 km', 'value': 115},
                                'duration': {'text': '1 min', 'value': 22},
                                'end_location': {'lat': 44.8353651, 'lng': -0.569199},
                                'html_instructions': 'Turn <b>left</b> onto <b> ...',
                                'maneuver': 'turn-left',
                                'polyline': {'points': 'suspGhcnBc@JE@_@DiAHA?w@F'},
                                'start_location': {
                                    'lat': 44.83434380000001,
                                    'lng': -0.5690105999999999,
                                },
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '0.3 km', 'value': 268},
                                'duration': {'text': '1 min', 'value': 59},
                                'end_location': {'lat': 44.8362675, 'lng': -0.5660914},
                                'html_instructions': 'Turn <b>right</b> onto <b> ...',
                                'maneuver': 'turn-right',
                                'polyline': {
                                    'points': 'a|spGndnBEYEQKi@Mi@Is@CYCOE]CQIq@ ...',
                                },
                                'start_location': {'lat': 44.8353651, 'lng': -0.56919},
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '0.1 km', 'value': 95},
                                'duration': {'text': '1 min', 'value': 29},
                                'end_location': {'lat': 44.8368458, 'lng': -0.5652398},
                                'html_instructions': 'Slight <b>left</b> onto <b> ...',
                                'maneuver': 'turn-slight-left',
                                'polyline': {
                                    'points': 'uatpG`qmBg@aAGM?ACE[k@CICGACEGCCAAEAG?',
                                },
                                'start_location': {
                                    'lat': 44.8362675,
                                    'lng': -0.5660914,
                                },
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '23 m', 'value': 23},
                                'duration': {'text': '1 min', 'value': 4},
                                'end_location': {'lat': 44.83697, 'lng': -0.5654425},
                                'html_instructions': 'Slight <b>left</b> to stay ...',
                                'maneuver': 'turn-slight-left',
                                'polyline': {
                                    'points': 'ietpGvkmBA@C?CBCBEHA@AB?B?B?B?@',
                                },
                                'start_location': {
                                    'lat': 44.8368458,
                                    'lng': -0.5652398,
                                },
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '0.2 km', 'value': 185},
                                'duration': {'text': '1 min', 'value': 23},
                                'end_location': {'lat': 44.8382126, 'lng': -0.5669969},
                                'html_instructions': 'Take the ramp to <b>Le Lac ...',
                                'polyline': {
                                    'points': 'aftpG~lmBY^[^sAdB]`@CDKLQRa@h@A@IZ',
                                },
                                'start_location': {'lat': 44.83697, 'lng': -0.5654425},
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '0.3 km', 'value': 253},
                                'duration': {'text': '1 min', 'value': 43},
                                'end_location': {'lat': 44.840163, 'lng': -0.5686525},
                                'html_instructions': 'Merge onto <b>Quai Richelieu</b>',
                                'maneuver': 'merge',
                                'polyline': {
                                    'points': 'ymtpGvvmBeAbAe@b@_@ZUN[To@f@e@^A?g ...',
                                },
                                'start_location': {
                                    'lat': 44.8382126,
                                    'lng': -0.5669969,
                                },
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '0.1 km', 'value': 110},
                                'duration': {'text': '1 min', 'value': 21},
                                'end_location': {'lat': 44.841079, 'lng': -0.5691835},
                                'html_instructions': 'Continue onto <b>Quai de la ...',
                                'polyline': {'points': '_ztpG`anBUNQLULUJOHMFKDWN'},
                                'start_location': {'lat': 44.840163, 'lng': -0.56865},
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '0.3 km', 'value': 262},
                                'duration': {'text': '1 min', 'value': 44},
                                'end_location': {'lat': 44.8433375, 'lng': -0.5701161},
                                'html_instructions': 'Continue onto <b>Quai du ...',
                                'polyline': {
                                    'points': 'w_upGjdnBeBl@sBn@gA^[JIBc@Nk@Nk@L',
                                },
                                'start_location': {'lat': 44.841079, 'lng': -0.56915},
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '0.6 km', 'value': 550},
                                'duration': {'text': '2 mins', 'value': 97},
                                'end_location': {
                                    'lat': 44.84822339999999,
                                    'lng': -0.5705307,
                                },
                                'html_instructions': 'Continue onto <b>Quai ...',
                                'polyline': {
                                    'points': '{mupGfjnBYFI@IBaAPUD{AX}@NK@]Fe@H ...',
                                },
                                'start_location': {
                                    'lat': 44.8433375,
                                    'lng': -0.5701161,
                                },
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '0.5 km', 'value': 508},
                                'duration': {'text': '1 min', 'value': 87},
                                'end_location': {'lat': 44.8523224, 'lng': -0.5678223},
                                'html_instructions': 'Continue onto ...',
                                'polyline': {
                                    'points': 'klvpGxlnBWEUGWGSGMEOEOE[KMEQGIA] ...',
                                },
                                'start_location': {
                                    'lat': 44.84822339999999,
                                    'lng': -0.5705307,
                                },
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '28 m', 'value': 28},
                                'duration': {'text': '1 min', 'value': 45},
                                'end_location': {
                                    'lat': 44.85245620000001,
                                    'lng': -0.5681259,
                                },
                                'html_instructions': 'Turn <b>left</b> onto <b> ...',
                                'maneuver': 'turn-left',
                                'polyline': {'points': '_fwpGz{mBGLADGPCFEN'},
                                'start_location': {
                                    'lat': 44.8523224,
                                    'lng': -0.5678223,
                                },
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '0.2 km', 'value': 176},
                                'duration': {'text': '1 min', 'value': 31},
                                'end_location': {'lat': 44.8536857, 'lng': -0.5667282},
                                'html_instructions': 'Turn <b>right</b> onto <b> ...',
                                'maneuver': 'turn-right',
                                'polyline': {
                                    'points': '{fwpGx}mB_@c@mAuAOQi@m@m@y@_@c@',
                                },
                                'start_location': {
                                    'lat': 44.85245620000001,
                                    'lng': -0.5681259,
                                },
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '0.2 km', 'value': 172},
                                'duration': {'text': '1 min', 'value': 28},
                                'end_location': {'lat': 44.8547766, 'lng': -0.5682825},
                                'html_instructions': 'Turn <b>left</b> onto <b> ... ',
                                'maneuver': 'turn-left',
                                'polyline': {'points': 'qnwpG`umBW`@UkDtF'},
                                'start_location': {
                                    'lat': 44.8536857,
                                    'lng': -0.5667282,
                                },
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '0.1 km', 'value': 101},
                                'duration': {'text': '1 min', 'value': 17},
                                'end_location': {'lat': 44.8554284, 'lng': -0.5673822},
                                'html_instructions': 'Turn <b>right</b> onto ...',
                                'maneuver': 'turn-right',
                                'polyline': {'points': 'kuwpGv~mBa@q@cA_B[a@'},
                                'start_location': {
                                    'lat': 44.8547766,
                                    'lng': -0.5682825,
                                },
                                'travel_mode': 'BICYCLING',
                            },
                            {
                                'distance': {'text': '15 m', 'value': 15},
                                'duration': {'text': '1 min', 'value': 3},
                                'end_location': {
                                    'lat': 44.85540839999999,
                                    'lng': -0.5672105,
                                },
                                'html_instructions': 'Turn <b>right</b> onto <b> ...',
                                'maneuver': 'turn-right',
                                'polyline': {'points': 'mywpGbymBBC@C?E?C?E?EAC'},
                                'start_location': {
                                    'lat': 44.8554284,
                                    'lng': -0.5673822,
                                },
                                'travel_mode': 'BICYCLING',
                            },
                        ],
                        'traffic_speed_entry': [],
                        'via_waypoint': [],
                    },
                ],
                'overview_polyline': {
                    'points': '}tspGxknBUoGi@LcDVe@_CW{Ba@sC[eA_@} ...',
                },
                'summary': 'Quai des Chartrons',
                'warnings': ['Bicycling directions are in beta ...'],
                'waypoint_order': [],
            },
        ]

    @pytest.fixture
    def _fake_google_api(self, api_response, monkeypatch):
        """Patch out the call to the Google Maps Directions API."""

        def directions(_self, *_args, **_kwargs):
            return api_response

        monkeypatch.setattr(googlemaps.Client, 'directions', directions)

    @pytest.mark.usefixtures('_fake_google_api')
    def test_sync_instances_with_google_maps(self, db_session, path):
        """Call the method for a `Path` object without Google data."""
        path.bicycle_distance = None
        path.bicycle_duration = None
        path._directions = None

        path.sync_with_google_maps()

        assert path.bicycle_distance == 2_999
        assert path.bicycle_duration == 596
        assert path._directions is not None

    @pytest.mark.usefixtures('_fake_google_api')
    def test_repeated_sync_instances_with_google_maps(self, db_session, path):
        """Call the method for a `Path` object with Google data.

        That call should immediately return without changing any data.

        We use the `path`'s "Google" data from above.
        """
        old_distance = path.bicycle_distance
        old_duration = path.bicycle_duration
        old_directions = path._directions

        path.sync_with_google_maps()

        assert path.bicycle_distance is old_distance
        assert path.bicycle_duration is old_duration
        assert path._directions is old_directions


class TestProperties:
    """Test properties in `Path`."""

    def test_waypoints_structure(self, path):
        """Test `Path.waypoints` property."""
        result = path.waypoints

        assert isinstance(result, list)
        assert isinstance(result[0], utils.Location)

    def test_waypoints_content(self, path):
        """Test `Path.waypoints` property."""
        result = path.waypoints

        # There are 5 inner points, excluding start and end,
        # i.e., the `.first_address` and `second_address`.
        assert len(result) == 5

    def test_waypoints_is_cached(self, path):
        """Test `Path.waypoints` property."""
        result1 = path.waypoints
        result2 = path.waypoints

        assert result1 is result2

@@ -24,7 +24,7 @@ def assoc(address, pixel):
 
 @pytest.mark.no_cover
 class TestSpecialMethods:
-    """Test special methods in `AddressPixelAssociation`."""
+    """Test special methods in `Pixel`."""
 
     def test_create_an_address_pixel_association(self, assoc):
         """Test instantiation of a new `AddressPixelAssociation` object."""
@@ -122,15 +122,6 @@ class TestProperties:
 
         assert result == pytest.approx(float(address.longitude))
 
-    def test_lat_lng(self, location, address):
-        """Test `Location.lat_lng` property."""
-        result = location.lat_lng
-
-        assert result == (
-            pytest.approx(float(address.latitude)),
-            pytest.approx(float(address.longitude)),
-        )
-
     def test_easting(self, location):
         """Test `Location.easting` property."""
         result = location.easting