+
+---
+
+**Documentation**: https://fastapi.tiangolo.com
+
+**Source Code**: https://github.com/tiangolo/fastapi
+
+---
+
+FastAPI is a modern, fast (high-performance) web framework for building APIs with Python 3.6+, based on standard Python type hints.
+
+The key features are:
+
+* **Fast**: Very high performance, on par with **NodeJS** and **Go** (thanks to Starlette and Pydantic). [One of the fastest Python frameworks available](#performance).
+
+* **Fast to code**: Increase the speed to develop features by about 200% to 300%. *
+* **Fewer bugs**: Reduce about 40% of human (developer) induced errors. *
+* **Intuitive**: Great editor support. Completion everywhere. Less time debugging.
+* **Easy**: Designed to be easy to use and learn. Less time reading docs.
+* **Short**: Minimize code duplication. Multiple features from each parameter declaration. Fewer bugs.
+* **Robust**: Get production-ready code. With automatic interactive documentation.
+* **Standards-based**: Based on (and fully compatible with) the open standards for APIs: OpenAPI (previously known as Swagger) and JSON Schema.
+
+\* Estimation based on tests on an internal development team, building production applications.
+
+
+## Opinions
+
+"_[...] I'm using **FastAPI** a ton these days. [...] I'm actually planning to use it for all of my team's **ML services at Microsoft**. Some of them are getting integrated into the core **Windows** product and some **Office** products._"
+
+
+
+---
+
+"_We adopted the **FastAPI** library to spawn a **REST** server that can be queried to obtain **predictions**. [for Ludwig]_"
+
+
+Piero Molino, Yaroslav Dudin, and Sai Sumanth Miryala - Uber
+
+---
+
+"_**Netflix** is pleased to announce the open-source release of our **crisis management** orchestration framework: **Dispatch**! [built with **FastAPI**]_"
+
+
+Kevin Glisson, Marc Vilanova, Forest Monsen - Netflix
+
+---
+
+"_I’m over the moon excited about **FastAPI**. It’s so fun!_"
+
+
+
+---
+
+"_Honestly, what you've built looks super solid and polished. In many ways, it's what I wanted **Hug** to be - it's really inspiring to see someone build that._"
+
+
+
+---
+
+"_If you're looking to learn one **modern framework** for building REST APIs, check out **FastAPI** [...] It's fast, easy to use and easy to learn [...]_"
+
+"_We've switched over to **FastAPI** for our **APIs** [...] I think you'll like it [...]_"
+
+
+
+---
+
+## **Typer**, the FastAPI of CLIs
+
+
+
+If you are building a CLI app to be used in the terminal instead of a web API, check out **Typer**.
+
+**Typer** is FastAPI's little sibling. And it's intended to be the **FastAPI of CLIs**. ⌨️ 🚀
+
+## Requirements
+
+Python 3.6+
+
+FastAPI stands on the shoulders of giants:
+
+* Starlette for the web parts.
+* Pydantic for the data parts.
+
+## Installation
+
+
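+Install FastAPI with `pip`; you will also want an ASGI server such as Uvicorn to run the examples below:
+
+```console
+$ pip install fastapi
+$ pip install uvicorn[standard]
+```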
+
+## Example
+
+### Create it
+
+* Create a file `main.py` with:
+
+```Python
+from typing import Optional
+
+from fastapi import FastAPI
+
+app = FastAPI()
+
+
+@app.get("/")
+def read_root():
+ return {"Hello": "World"}
+
+
+@app.get("/items/{item_id}")
+def read_item(item_id: int, q: Optional[str] = None):
+ return {"item_id": item_id, "q": q}
+```
+
+
+Or use `async def`...
+
+If your code uses `async` / `await`, use `async def`:
+
+```Python hl_lines="9 14"
+from typing import Optional
+
+from fastapi import FastAPI
+
+app = FastAPI()
+
+
+@app.get("/")
+async def read_root():
+ return {"Hello": "World"}
+
+
+@app.get("/items/{item_id}")
+async def read_item(item_id: int, q: Optional[str] = None):
+ return {"item_id": item_id, "q": q}
+```
+
+**Note**:
+
+If you don't know, check the _"In a hurry?"_ section about `async` and `await` in the docs.
+
+
+
+### Run it
+
+Run the server with:
+
+
+
+```console
+$ uvicorn main:app --reload
+
+INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
+INFO: Started reloader process [28720]
+INFO: Started server process [28722]
+INFO: Waiting for application startup.
+INFO: Application startup complete.
+```
+
+
+
+
+About the command `uvicorn main:app --reload`...
+
+The command `uvicorn main:app` refers to:
+
+* `main`: the file `main.py` (the Python "module").
+* `app`: the object created inside of `main.py` with the line `app = FastAPI()`.
+* `--reload`: make the server restart after code changes. Only do this for development.
+
+
+
+### Check it
+
+Open your browser at http://127.0.0.1:8000/items/5?q=somequery.
+
+You will see the JSON response as:
+
+```JSON
+{"item_id": 5, "q": "somequery"}
+```
+
+You already created an API that:
+
+* Receives HTTP requests in the _paths_ `/` and `/items/{item_id}`.
+* Both _paths_ take `GET` operations (also known as HTTP _methods_).
+* The _path_ `/items/{item_id}` has a _path parameter_ `item_id` that should be an `int`.
+* The _path_ `/items/{item_id}` has an optional `str` _query parameter_ `q`.
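+
+The type declarations also drive validation: try http://127.0.0.1:8000/items/foo and, because `foo` is not an `int`, you get a `422 Unprocessable Entity` response with a clear error, roughly like:
+
+```JSON
+{
+    "detail": [
+        {
+            "loc": ["path", "item_id"],
+            "msg": "value is not a valid integer",
+            "type": "type_error.integer"
+        }
+    ]
+}
+```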
+
+### Interactive API docs
+
+Now go to http://127.0.0.1:8000/docs.
+
+You will see the automatic interactive API documentation (provided by Swagger UI):
+
+![Swagger UI](https://fastapi.tiangolo.com/img/index/index-01-swagger-ui-simple.png)
+
+### Alternative API docs
+
+And now, go to http://127.0.0.1:8000/redoc.
+
+You will see the alternative automatic documentation (provided by ReDoc):
+
+![ReDoc](https://fastapi.tiangolo.com/img/index/index-02-redoc-simple.png)
+
+## Example upgrade
+
+Now modify the file `main.py` to receive a body from a `PUT` request.
+
+Declare the body using standard Python types, thanks to Pydantic.
+
+```Python hl_lines="4 9-12 25-27"
+from typing import Optional
+
+from fastapi import FastAPI
+from pydantic import BaseModel
+
+app = FastAPI()
+
+
+class Item(BaseModel):
+ name: str
+ price: float
+ is_offer: Optional[bool] = None
+
+
+@app.get("/")
+def read_root():
+ return {"Hello": "World"}
+
+
+@app.get("/items/{item_id}")
+def read_item(item_id: int, q: Optional[str] = None):
+ return {"item_id": item_id, "q": q}
+
+
+@app.put("/items/{item_id}")
+def update_item(item_id: int, item: Item):
+ return {"item_name": item.name, "item_id": item_id}
+```
+
+The server should reload automatically (because you added `--reload` to the `uvicorn` command above).
+
+### Interactive API docs upgrade
+
+Now go to http://127.0.0.1:8000/docs.
+
+* The interactive API documentation will be automatically updated, including the new body:
+
+![Swagger UI](https://fastapi.tiangolo.com/img/index/index-03-swagger-02.png)
+
+* Click the "Try it out" button; it allows you to fill in the parameters and interact directly with the API:
+
+![Swagger UI interaction](https://fastapi.tiangolo.com/img/index/index-04-swagger-03.png)
+
+* Then click the "Execute" button; the user interface will communicate with your API, send the parameters, get the results, and show them on the screen:
+
+![Swagger UI interaction](https://fastapi.tiangolo.com/img/index/index-05-swagger-04.png)
+
+### Alternative API docs upgrade
+
+And now, go to http://127.0.0.1:8000/redoc.
+
+* The alternative documentation will also reflect the new query parameter and body:
+
+![ReDoc](https://fastapi.tiangolo.com/img/index/index-06-redoc-02.png)
+
+### Recap
+
+In summary, you declare **once** the types of parameters, body, etc. as function parameters.
+
+You do that with standard modern Python types.
+
+You don't have to learn a new syntax, the methods or classes of a specific library, etc.
+
+Just standard **Python 3.6+**.
+
+For example, for an `int`:
+
+```Python
+item_id: int
+```
+
+or for a more complex `Item` model:
+
+```Python
+item: Item
+```
+
+...and with that single declaration you get:
+
+* Editor support, including:
+ * Completion.
+ * Type checks.
+* Validation of data:
+ * Automatic and clear errors when the data is invalid.
+ * Validation even for deeply nested JSON objects.
+* Conversion of input data: coming from the network to Python data and types. Reading from:
+ * JSON.
+ * Path parameters.
+ * Query parameters.
+ * Cookies.
+ * Headers.
+ * Forms.
+ * Files.
+* Conversion of output data: converting from Python data and types to network data (as JSON):
+ * Convert Python types (`str`, `int`, `float`, `bool`, `list`, etc).
+ * `datetime` objects.
+ * `UUID` objects.
+ * Database models.
+ * ...and many more.
+* Automatic interactive API documentation, including 2 alternative user interfaces:
+ * Swagger UI.
+ * ReDoc.
+
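+The mechanics are easy to sketch in plain Python: read the function's type hints and use them to convert and validate the incoming values. The following toy, stdlib-only sketch illustrates the idea; it is **not** FastAPI's actual implementation (which delegates to Pydantic and handles `Optional`, nested models, etc.):
+
+```Python
+import inspect
+from typing import Any, Callable, Dict
+
+
+def validate_call(func: Callable[..., Any], raw: Dict[str, str]) -> Any:
+    """Coerce raw string inputs (like path/query parameters) using the
+    function's type hints, raising a clear error on bad data."""
+    sig = inspect.signature(func)
+    kwargs = {}
+    for name, param in sig.parameters.items():
+        hint = param.annotation
+        if name not in raw:
+            # Parameters with a default are optional, like `q` below.
+            if param.default is inspect.Parameter.empty:
+                raise ValueError(f"missing required parameter: {name}")
+            kwargs[name] = param.default
+            continue
+        try:
+            # Call the hint (e.g. int, str) to convert the raw string.
+            if hint is inspect.Parameter.empty:
+                kwargs[name] = raw[name]
+            else:
+                kwargs[name] = hint(raw[name])
+        except (TypeError, ValueError):
+            raise ValueError(
+                f"{name} is not a valid {getattr(hint, '__name__', hint)}"
+            )
+    return func(**kwargs)
+
+
+def read_item(item_id: int, q: str = None):
+    return {"item_id": item_id, "q": q}
+```
+
+With this sketch, `validate_call(read_item, {"item_id": "5", "q": "somequery"})` converts `"5"` to the `int` `5`, while `validate_call(read_item, {"item_id": "foo"})` raises a validation error, mirroring the behavior described above.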
+---
+
+Coming back to the previous code example, **FastAPI** will:
+
+* Validate that there is an `item_id` in the path for `GET` and `PUT` requests.
+* Validate that the `item_id` is of type `int` for `GET` and `PUT` requests.
+ * If it is not, the client will see a useful, clear error.
+* Check if there is an optional query parameter named `q` (as in `http://127.0.0.1:8000/items/foo?q=somequery`) for `GET` requests.
+ * As the `q` parameter is declared with `= None`, it is optional.
+ * Without the `None` it would be required (as is the body in the case with `PUT`).
+* For `PUT` requests to `/items/{item_id}`, read the body as JSON:
+ * Check that it has a required attribute `name` that should be a `str`.
+ * Check that it has a required attribute `price` that has to be a `float`.
+ * Check that it has an optional attribute `is_offer`, that should be a `bool`, if present.
+ * All this would also work for deeply nested JSON objects.
+* Convert from and to JSON automatically.
+* Document everything with OpenAPI, so it can be used by:
+ * Interactive documentation systems.
+ * Automatic client code generation systems, for many languages.
+* Provide 2 interactive documentation web interfaces directly.
+
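+For example, you can exercise the `PUT` path operation above from the command line (assuming the server from the "Run it" step is still running):
+
+```console
+$ curl -X PUT "http://127.0.0.1:8000/items/5" \
+    -H "Content-Type: application/json" \
+    -d '{"name": "Thing", "price": 3.5}'
+{"item_name":"Thing","item_id":5}
+```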
+---
+
+We just scratched the surface, but you already get the idea of how it all works.
+
+Try changing the line with:
+
+```Python
+ return {"item_name": item.name, "item_id": item_id}
+```
+
+...from:
+
+```Python
+ ... "item_name": item.name ...
+```
+
+...to:
+
+```Python
+ ... "item_price": item.price ...
+```
+
+...and see how your editor will auto-complete the attributes and know their types:
+
+![editor support](https://fastapi.tiangolo.com/img/vscode-completion.png)
+
+For a more complete example including more features, see the Tutorial - User Guide.
+
+**Spoiler alert**: the tutorial - user guide includes:
+
+* Declaration of **parameters** from other places, such as: **headers**, **cookies**, **form fields** and **files**.
+* How to set **validation constraints** such as `max_length` or `regex`.
+* A very powerful and easy to use **Dependency Injection** system.
+* Security and authentication, including support for **OAuth2** with **JWT tokens** and **HTTP Basic** auth.
+* More advanced (but equally easy) techniques for declaring **deeply nested JSON models** (thanks to Pydantic).
+* Many extra features (thanks to Starlette), such as:
+ * **WebSockets**
+ * **GraphQL**
+ * extremely easy tests based on `requests` and `pytest`
+ * **CORS**
+ * **Cookie Sessions**
+ * ...and more.
+
+## Performance
+
+Independent TechEmpower benchmarks show **FastAPI** applications running under Uvicorn as one of the fastest Python frameworks available, only below Starlette and Uvicorn themselves (used internally by FastAPI). (*)
+
+To understand more about it, see the section Benchmarks.
+
+## Optional Dependencies
+
+Used by Pydantic:
+
+* ujson - for faster JSON "parsing".
+* email_validator - for email validation.
+
+Used by Starlette:
+
+* requests - Required if you want to use the `TestClient`.
+* aiofiles - Required if you want to use `FileResponse` or `StaticFiles`.
+* jinja2 - Required if you want to use the default template configuration.
+* python-multipart - Required if you want to support form "parsing", with `request.form()`.
+* itsdangerous - Required for `SessionMiddleware` support.
+* pyyaml - Required for Starlette's `SchemaGenerator` support (you probably don't need it with FastAPI).
+* graphene - Required for `GraphQLApp` support.
+* ujson - Required if you want to use `UJSONResponse`.
+
+Used by FastAPI / Starlette:
+
+* uvicorn - for the server that loads and serves your application.
+* orjson - Required if you want to use `ORJSONResponse`.
+
+You can install all of these with `pip install fastapi[all]`.
+
+## License
+
+This project is licensed under the terms of the MIT license.
+
diff --git a/.venv/lib/python3.9/site-packages/fastapi-0.63.0.dist-info/RECORD b/.venv/lib/python3.9/site-packages/fastapi-0.63.0.dist-info/RECORD
new file mode 100644
index 0000000..afc6d20
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi-0.63.0.dist-info/RECORD
@@ -0,0 +1,89 @@
+fastapi-0.63.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+fastapi-0.63.0.dist-info/LICENSE,sha256=Tsif_IFIW5f-xYSy1KlhAy7v_oNEU4lP2cEnSQbMdE4,1086
+fastapi-0.63.0.dist-info/METADATA,sha256=RcIwSNhMxnNo_JBlnZes3m5PCAvJkAQ_IMlcYl0O4rU,22908
+fastapi-0.63.0.dist-info/RECORD,,
+fastapi-0.63.0.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+fastapi-0.63.0.dist-info/WHEEL,sha256=CqyTrkghQBNsEzLD3HbCSEIJ_fY58-XpoU29dUzwHSk,81
+fastapi/__init__.py,sha256=IYUOJjO5_dMSTt-v0rEmKTl-YzoF5W-XA1okAP6-0XA,1015
+fastapi/__pycache__/__init__.cpython-39.pyc,,
+fastapi/__pycache__/applications.cpython-39.pyc,,
+fastapi/__pycache__/background.cpython-39.pyc,,
+fastapi/__pycache__/concurrency.cpython-39.pyc,,
+fastapi/__pycache__/datastructures.cpython-39.pyc,,
+fastapi/__pycache__/encoders.cpython-39.pyc,,
+fastapi/__pycache__/exception_handlers.cpython-39.pyc,,
+fastapi/__pycache__/exceptions.cpython-39.pyc,,
+fastapi/__pycache__/logger.cpython-39.pyc,,
+fastapi/__pycache__/param_functions.cpython-39.pyc,,
+fastapi/__pycache__/params.cpython-39.pyc,,
+fastapi/__pycache__/requests.cpython-39.pyc,,
+fastapi/__pycache__/responses.cpython-39.pyc,,
+fastapi/__pycache__/routing.cpython-39.pyc,,
+fastapi/__pycache__/staticfiles.cpython-39.pyc,,
+fastapi/__pycache__/templating.cpython-39.pyc,,
+fastapi/__pycache__/testclient.cpython-39.pyc,,
+fastapi/__pycache__/types.cpython-39.pyc,,
+fastapi/__pycache__/utils.cpython-39.pyc,,
+fastapi/__pycache__/websockets.cpython-39.pyc,,
+fastapi/applications.py,sha256=D24FMrLOxB7WeZxMaH14YWEg3FnB2Uw8HVcWst1b2CY,31603
+fastapi/background.py,sha256=HtN5_pJJrOdalSbuGSMKJAPNWUU5h7rY_BXXubu7-IQ,76
+fastapi/concurrency.py,sha256=2WhXMOKbv-BDmgorXCdwqmKfMGJekOMCb2x3WagOf6I,1720
+fastapi/datastructures.py,sha256=n_yD3ybdtdwB3cQTCie61RAjYRgxAsIZbwHRDy8Hkgk,1389
+fastapi/dependencies/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+fastapi/dependencies/__pycache__/__init__.cpython-39.pyc,,
+fastapi/dependencies/__pycache__/models.cpython-39.pyc,,
+fastapi/dependencies/__pycache__/utils.cpython-39.pyc,,
+fastapi/dependencies/models.py,sha256=zNbioxICuOeb-9ADDVQ45hUHOC0PBtPVEfVU3f1l_nc,2494
+fastapi/dependencies/utils.py,sha256=6VkX0HiaG-j2zszpe-QFPY9ZH0zh54IxTArLxu3CJ6c,28831
+fastapi/encoders.py,sha256=o4o-qUlgCY1Tmm8QFeuNzvbBVpf0jzJanYMMUaVE8xA,5268
+fastapi/exception_handlers.py,sha256=UVYCCe4qt5-5_NuQ3SxTXjDvOdKMHiTfcLp3RUKXhg8,912
+fastapi/exceptions.py,sha256=KDnOOHp1EQ_Pz4XG9nHKYbf7AcagVwhsa6s72w6IRsQ,1080
+fastapi/logger.py,sha256=I9NNi3ov8AcqbsbC9wl1X-hdItKgYt2XTrx1f99Zpl4,54
+fastapi/middleware/__init__.py,sha256=oQDxiFVcc1fYJUOIFvphnK7pTT5kktmfL32QXpBFvvo,58
+fastapi/middleware/__pycache__/__init__.cpython-39.pyc,,
+fastapi/middleware/__pycache__/cors.cpython-39.pyc,,
+fastapi/middleware/__pycache__/gzip.cpython-39.pyc,,
+fastapi/middleware/__pycache__/httpsredirect.cpython-39.pyc,,
+fastapi/middleware/__pycache__/trustedhost.cpython-39.pyc,,
+fastapi/middleware/__pycache__/wsgi.cpython-39.pyc,,
+fastapi/middleware/cors.py,sha256=ynwjWQZoc_vbhzZ3_ZXceoaSrslHFHPdoM52rXr0WUU,79
+fastapi/middleware/gzip.py,sha256=xM5PcsH8QlAimZw4VDvcmTnqQamslThsfe3CVN2voa0,79
+fastapi/middleware/httpsredirect.py,sha256=rL8eXMnmLijwVkH7_400zHri1AekfeBd6D6qs8ix950,115
+fastapi/middleware/trustedhost.py,sha256=eE5XGRxGa7c5zPnMJDGp3BxaL25k5iVQlhnv-Pk0Pss,109
+fastapi/middleware/wsgi.py,sha256=Z3Ue-7wni4lUZMvH3G9ek__acgYdJstbnpZX_HQAboY,79
+fastapi/openapi/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+fastapi/openapi/__pycache__/__init__.cpython-39.pyc,,
+fastapi/openapi/__pycache__/constants.cpython-39.pyc,,
+fastapi/openapi/__pycache__/docs.cpython-39.pyc,,
+fastapi/openapi/__pycache__/models.cpython-39.pyc,,
+fastapi/openapi/__pycache__/utils.cpython-39.pyc,,
+fastapi/openapi/constants.py,sha256=sJSpZzRp7Kky9R-jucU-K6_pJzLBRO75ddW7-MixZWc,166
+fastapi/openapi/docs.py,sha256=XyDQ4t2Ca95ZN_sSfwjCP3DcwM5Rv21FrwqTfk4x_H4,5538
+fastapi/openapi/models.py,sha256=xv8t-7w2cYFbXr9HtgX3NDxxMDyZ1Dkg5-1XZ3kDX8E,10439
+fastapi/openapi/utils.py,sha256=M51USLtZWu6sVQmLdkKsrodztBZkAxnqdilNMxcrqio,15448
+fastapi/param_functions.py,sha256=Kd3SoRYG0q7C52_Qm5HJ4xpE8nU3jt58b72naJjQTN4,6118
+fastapi/params.py,sha256=t5MXQR1GyH0F9ymjnMp6lr7dFIHnXRzau9dn7PiRVGc,8832
+fastapi/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+fastapi/requests.py,sha256=zayepKFcienBllv3snmWI20Gk0oHNVLU4DDhqXBb4LU,142
+fastapi/responses.py,sha256=I5-0Ao6AwBtIJo3BEJK4vBiYXGs73ZaD8IlolQAcJAI,936
+fastapi/routing.py,sha256=0zbGl-luaLIVeNrW7nZrYlAN7VAfYf1xCrt0dr9f8x0,46856
+fastapi/security/__init__.py,sha256=bO8pNmxqVRXUjfl2mOKiVZLn0FpBQ61VUYVjmppnbJw,881
+fastapi/security/__pycache__/__init__.cpython-39.pyc,,
+fastapi/security/__pycache__/api_key.cpython-39.pyc,,
+fastapi/security/__pycache__/base.cpython-39.pyc,,
+fastapi/security/__pycache__/http.cpython-39.pyc,,
+fastapi/security/__pycache__/oauth2.cpython-39.pyc,,
+fastapi/security/__pycache__/open_id_connect_url.cpython-39.pyc,,
+fastapi/security/__pycache__/utils.cpython-39.pyc,,
+fastapi/security/api_key.py,sha256=WdgOMNWoFbEuQeteeEbifDgjhDEdXj523FE5nEqAI8k,2427
+fastapi/security/base.py,sha256=dl4pvbC-RxjfbWgPtCWd8MVU-7CB2SZ22rJDXVCXO6c,141
+fastapi/security/http.py,sha256=jPDWs2V1pKeXTtCQjhXAG_ZnwTqaX_wq7Mun6uFDl30,5640
+fastapi/security/oauth2.py,sha256=w14ZLUfUWSmHTZnDnynQKMvfXvlYtoswnJ_90yVV6kM,7861
+fastapi/security/open_id_connect_url.py,sha256=vLlY8Ek6H3_QsA4mc_UhSJ8UTp9uMOYxYKn5DD7RmJc,1055
+fastapi/security/utils.py,sha256=izlh-HBaL1VnJeOeRTQnyNgI3hgTFs73eCyLy-snb4A,266
+fastapi/staticfiles.py,sha256=iirGIt3sdY2QZXd36ijs3Cj-T0FuGFda3cd90kM9Ikw,69
+fastapi/templating.py,sha256=4zsuTWgcjcEainMJFAlW6-gnslm6AgOS1SiiDWfmQxk,76
+fastapi/testclient.py,sha256=nBvaAmX66YldReJNZXPOk1sfuo2Q6hs8bOvIaCep6LQ,66
+fastapi/types.py,sha256=r6MngTHzkZOP9lzXgduje9yeZe5EInWAzCLuRJlhIuE,118
+fastapi/utils.py,sha256=g_H9Owy8vbUgY_L4tfYBJRdX9ofIqKPXkhh0LTRLRYE,5545
+fastapi/websockets.py,sha256=SroIkqE-lfChvtRP3mFaNKKtD6TmePDWBZtQfgM4noo,148
diff --git a/.venv/lib/python3.9/site-packages/fastapi-0.63.0.dist-info/REQUESTED b/.venv/lib/python3.9/site-packages/fastapi-0.63.0.dist-info/REQUESTED
new file mode 100644
index 0000000..e69de29
diff --git a/.venv/lib/python3.9/site-packages/fastapi-0.63.0.dist-info/WHEEL b/.venv/lib/python3.9/site-packages/fastapi-0.63.0.dist-info/WHEEL
new file mode 100644
index 0000000..b767d46
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi-0.63.0.dist-info/WHEEL
@@ -0,0 +1,4 @@
+Wheel-Version: 1.0
+Generator: flit 3.0.0
+Root-Is-Purelib: true
+Tag: py3-none-any
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__init__.py b/.venv/lib/python3.9/site-packages/fastapi/__init__.py
new file mode 100644
index 0000000..c0bb0e4
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/__init__.py
@@ -0,0 +1,24 @@
+"""FastAPI framework, high performance, easy to learn, fast to code, ready for production"""
+
+__version__ = "0.63.0"
+
+from starlette import status as status
+
+from .applications import FastAPI as FastAPI
+from .background import BackgroundTasks as BackgroundTasks
+from .datastructures import UploadFile as UploadFile
+from .exceptions import HTTPException as HTTPException
+from .param_functions import Body as Body
+from .param_functions import Cookie as Cookie
+from .param_functions import Depends as Depends
+from .param_functions import File as File
+from .param_functions import Form as Form
+from .param_functions import Header as Header
+from .param_functions import Path as Path
+from .param_functions import Query as Query
+from .param_functions import Security as Security
+from .requests import Request as Request
+from .responses import Response as Response
+from .routing import APIRouter as APIRouter
+from .websockets import WebSocket as WebSocket
+from .websockets import WebSocketDisconnect as WebSocketDisconnect
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..5cbd4ab
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/applications.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/applications.cpython-39.pyc
new file mode 100644
index 0000000..1962550
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/applications.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/background.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/background.cpython-39.pyc
new file mode 100644
index 0000000..be06e5e
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/background.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/concurrency.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/concurrency.cpython-39.pyc
new file mode 100644
index 0000000..3052cd2
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/concurrency.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/datastructures.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/datastructures.cpython-39.pyc
new file mode 100644
index 0000000..63c54cc
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/datastructures.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/encoders.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/encoders.cpython-39.pyc
new file mode 100644
index 0000000..f70a635
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/encoders.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/exception_handlers.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/exception_handlers.cpython-39.pyc
new file mode 100644
index 0000000..df2e56b
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/exception_handlers.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/exceptions.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/exceptions.cpython-39.pyc
new file mode 100644
index 0000000..a9d70a5
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/exceptions.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/logger.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/logger.cpython-39.pyc
new file mode 100644
index 0000000..6984aaf
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/logger.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/param_functions.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/param_functions.cpython-39.pyc
new file mode 100644
index 0000000..23a3dc9
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/param_functions.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/params.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/params.cpython-39.pyc
new file mode 100644
index 0000000..8d73dc7
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/params.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/requests.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/requests.cpython-39.pyc
new file mode 100644
index 0000000..c9f16c9
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/requests.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/responses.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/responses.cpython-39.pyc
new file mode 100644
index 0000000..bbb60d2
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/responses.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/routing.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/routing.cpython-39.pyc
new file mode 100644
index 0000000..383b341
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/routing.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/staticfiles.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/staticfiles.cpython-39.pyc
new file mode 100644
index 0000000..5841a50
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/staticfiles.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/templating.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/templating.cpython-39.pyc
new file mode 100644
index 0000000..698c169
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/templating.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/testclient.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/testclient.cpython-39.pyc
new file mode 100644
index 0000000..4888739
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/testclient.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/types.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/types.cpython-39.pyc
new file mode 100644
index 0000000..19f71e8
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/types.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/utils.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/utils.cpython-39.pyc
new file mode 100644
index 0000000..ee54a25
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/utils.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/__pycache__/websockets.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/websockets.cpython-39.pyc
new file mode 100644
index 0000000..dce86bc
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/__pycache__/websockets.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/applications.py b/.venv/lib/python3.9/site-packages/fastapi/applications.py
new file mode 100644
index 0000000..92d041c
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/applications.py
@@ -0,0 +1,739 @@
+from typing import Any, Callable, Coroutine, Dict, List, Optional, Sequence, Type, Union
+
+from fastapi import routing
+from fastapi.concurrency import AsyncExitStack
+from fastapi.datastructures import Default, DefaultPlaceholder
+from fastapi.encoders import DictIntStrAny, SetIntStr
+from fastapi.exception_handlers import (
+ http_exception_handler,
+ request_validation_exception_handler,
+)
+from fastapi.exceptions import RequestValidationError
+from fastapi.logger import logger
+from fastapi.openapi.docs import (
+ get_redoc_html,
+ get_swagger_ui_html,
+ get_swagger_ui_oauth2_redirect_html,
+)
+from fastapi.openapi.utils import get_openapi
+from fastapi.params import Depends
+from fastapi.types import DecoratedCallable
+from starlette.applications import Starlette
+from starlette.datastructures import State
+from starlette.exceptions import HTTPException
+from starlette.middleware import Middleware
+from starlette.requests import Request
+from starlette.responses import HTMLResponse, JSONResponse, Response
+from starlette.routing import BaseRoute
+from starlette.types import ASGIApp, Receive, Scope, Send
+
+
+class FastAPI(Starlette):
+ def __init__(
+ self,
+ *,
+ debug: bool = False,
+ routes: Optional[List[BaseRoute]] = None,
+ title: str = "FastAPI",
+ description: str = "",
+ version: str = "0.1.0",
+ openapi_url: Optional[str] = "/openapi.json",
+ openapi_tags: Optional[List[Dict[str, Any]]] = None,
+ servers: Optional[List[Dict[str, Union[str, Any]]]] = None,
+ dependencies: Optional[Sequence[Depends]] = None,
+ default_response_class: Type[Response] = Default(JSONResponse),
+ docs_url: Optional[str] = "/docs",
+ redoc_url: Optional[str] = "/redoc",
+ swagger_ui_oauth2_redirect_url: Optional[str] = "/docs/oauth2-redirect",
+ swagger_ui_init_oauth: Optional[Dict[str, Any]] = None,
+ middleware: Optional[Sequence[Middleware]] = None,
+ exception_handlers: Optional[
+ Dict[
+ Union[int, Type[Exception]],
+ Callable[[Request, Any], Coroutine[Any, Any, Response]],
+ ]
+ ] = None,
+ on_startup: Optional[Sequence[Callable[[], Any]]] = None,
+ on_shutdown: Optional[Sequence[Callable[[], Any]]] = None,
+ openapi_prefix: str = "",
+ root_path: str = "",
+ root_path_in_servers: bool = True,
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ deprecated: Optional[bool] = None,
+ include_in_schema: bool = True,
+ **extra: Any,
+ ) -> None:
+ self._debug: bool = debug
+ self.state: State = State()
+ self.router: routing.APIRouter = routing.APIRouter(
+ routes=routes,
+ dependency_overrides_provider=self,
+ on_startup=on_startup,
+ on_shutdown=on_shutdown,
+ default_response_class=default_response_class,
+ dependencies=dependencies,
+ callbacks=callbacks,
+ deprecated=deprecated,
+ include_in_schema=include_in_schema,
+ responses=responses,
+ )
+ self.exception_handlers: Dict[
+ Union[int, Type[Exception]],
+ Callable[[Request, Any], Coroutine[Any, Any, Response]],
+ ] = (
+ {} if exception_handlers is None else dict(exception_handlers)
+ )
+ self.exception_handlers.setdefault(HTTPException, http_exception_handler)
+ self.exception_handlers.setdefault(
+ RequestValidationError, request_validation_exception_handler
+ )
+
+ self.user_middleware: List[Middleware] = (
+ [] if middleware is None else list(middleware)
+ )
+ self.middleware_stack: ASGIApp = self.build_middleware_stack()
+
+ self.title = title
+ self.description = description
+ self.version = version
+ self.servers = servers or []
+ self.openapi_url = openapi_url
+ self.openapi_tags = openapi_tags
+ # TODO: remove when discarding the openapi_prefix parameter
+ if openapi_prefix:
+ logger.warning(
+ '"openapi_prefix" has been deprecated in favor of "root_path", which '
+ "follows more closely the ASGI standard, is simpler, and more "
+ "automatic. Check the docs at "
+ "https://fastapi.tiangolo.com/advanced/sub-applications/"
+ )
+ self.root_path = root_path or openapi_prefix
+ self.root_path_in_servers = root_path_in_servers
+ self.docs_url = docs_url
+ self.redoc_url = redoc_url
+ self.swagger_ui_oauth2_redirect_url = swagger_ui_oauth2_redirect_url
+ self.swagger_ui_init_oauth = swagger_ui_init_oauth
+ self.extra = extra
+ self.dependency_overrides: Dict[Callable[..., Any], Callable[..., Any]] = {}
+
+ self.openapi_version = "3.0.2"
+
+ if self.openapi_url:
+ assert self.title, "A title must be provided for OpenAPI, e.g.: 'My API'"
+ assert self.version, "A version must be provided for OpenAPI, e.g.: '2.1.0'"
+ self.openapi_schema: Optional[Dict[str, Any]] = None
+ self.setup()
+
+ def openapi(self) -> Dict[str, Any]:
+ if not self.openapi_schema:
+ self.openapi_schema = get_openapi(
+ title=self.title,
+ version=self.version,
+ openapi_version=self.openapi_version,
+ description=self.description,
+ routes=self.routes,
+ tags=self.openapi_tags,
+ servers=self.servers,
+ )
+ return self.openapi_schema
+
+ def setup(self) -> None:
+ if self.openapi_url:
+ urls = (server_data.get("url") for server_data in self.servers)
+ server_urls = {url for url in urls if url}
+
+ async def openapi(req: Request) -> JSONResponse:
+ root_path = req.scope.get("root_path", "").rstrip("/")
+ if root_path not in server_urls:
+ if root_path and self.root_path_in_servers:
+ self.servers.insert(0, {"url": root_path})
+ server_urls.add(root_path)
+ return JSONResponse(self.openapi())
+
+ self.add_route(self.openapi_url, openapi, include_in_schema=False)
+ if self.openapi_url and self.docs_url:
+
+ async def swagger_ui_html(req: Request) -> HTMLResponse:
+ root_path = req.scope.get("root_path", "").rstrip("/")
+ openapi_url = root_path + self.openapi_url
+ oauth2_redirect_url = self.swagger_ui_oauth2_redirect_url
+ if oauth2_redirect_url:
+ oauth2_redirect_url = root_path + oauth2_redirect_url
+ return get_swagger_ui_html(
+ openapi_url=openapi_url,
+ title=self.title + " - Swagger UI",
+ oauth2_redirect_url=oauth2_redirect_url,
+ init_oauth=self.swagger_ui_init_oauth,
+ )
+
+ self.add_route(self.docs_url, swagger_ui_html, include_in_schema=False)
+
+ if self.swagger_ui_oauth2_redirect_url:
+
+ async def swagger_ui_redirect(req: Request) -> HTMLResponse:
+ return get_swagger_ui_oauth2_redirect_html()
+
+ self.add_route(
+ self.swagger_ui_oauth2_redirect_url,
+ swagger_ui_redirect,
+ include_in_schema=False,
+ )
+ if self.openapi_url and self.redoc_url:
+
+ async def redoc_html(req: Request) -> HTMLResponse:
+ root_path = req.scope.get("root_path", "").rstrip("/")
+ openapi_url = root_path + self.openapi_url
+ return get_redoc_html(
+ openapi_url=openapi_url, title=self.title + " - ReDoc"
+ )
+
+ self.add_route(self.redoc_url, redoc_html, include_in_schema=False)
+
+ async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
+ if self.root_path:
+ scope["root_path"] = self.root_path
+ if AsyncExitStack:
+ async with AsyncExitStack() as stack:
+ scope["fastapi_astack"] = stack
+ await super().__call__(scope, receive, send)
+ else:
+ await super().__call__(scope, receive, send) # pragma: no cover
+
+ def add_api_route(
+ self,
+ path: str,
+ endpoint: Callable[..., Coroutine[Any, Any, Response]],
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ methods: Optional[List[str]] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Union[Type[Response], DefaultPlaceholder] = Default(
+ JSONResponse
+ ),
+ name: Optional[str] = None,
+ ) -> None:
+ self.router.add_api_route(
+ path,
+ endpoint=endpoint,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ methods=methods,
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ )
+
+ def api_route(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ methods: Optional[List[str]] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ def decorator(func: DecoratedCallable) -> DecoratedCallable:
+ self.router.add_api_route(
+ path,
+ func,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ methods=methods,
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ )
+ return func
+
+ return decorator
+
+ def add_api_websocket_route(
+ self, path: str, endpoint: Callable[..., Any], name: Optional[str] = None
+ ) -> None:
+ self.router.add_api_websocket_route(path, endpoint, name=name)
+
+ def websocket(
+ self, path: str, name: Optional[str] = None
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ def decorator(func: DecoratedCallable) -> DecoratedCallable:
+ self.add_api_websocket_route(path, func, name=name)
+ return func
+
+ return decorator
+
+ def include_router(
+ self,
+ router: routing.APIRouter,
+ *,
+ prefix: str = "",
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[Depends]] = None,
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ include_in_schema: bool = True,
+ default_response_class: Type[Response] = Default(JSONResponse),
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> None:
+ self.router.include_router(
+ router,
+ prefix=prefix,
+ tags=tags,
+ dependencies=dependencies,
+ responses=responses,
+ deprecated=deprecated,
+ include_in_schema=include_in_schema,
+ default_response_class=default_response_class,
+ callbacks=callbacks,
+ )
+
+ def get(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.router.get(
+ path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def put(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.router.put(
+ path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def post(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.router.post(
+ path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def delete(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.router.delete(
+ path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ operation_id=operation_id,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def options(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.router.options(
+ path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def head(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.router.head(
+ path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def patch(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.router.patch(
+ path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def trace(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.router.trace(
+ path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
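The `__call__` method of the application above stashes an `AsyncExitStack` in the ASGI scope (`scope["fastapi_astack"]`) so that dependencies declared with `yield` get their cleanup code run once the request finishes. A minimal self-contained sketch of that pattern — the `app`, `endpoint`, and the `"astack"` key here are illustrative stand-ins, not FastAPI's real internals:

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager

events = []

@asynccontextmanager
async def get_db():
    # A "yield dependency": setup before the yield, cleanup after it.
    events.append("open")
    yield "db-connection"
    events.append("close")

async def endpoint(scope):
    # The endpoint enters the dependency on the stack stored in the scope,
    # so cleanup is deferred until the stack itself closes.
    db = await scope["astack"].enter_async_context(get_db())
    events.append(f"handled with {db}")

async def app(scope):
    # Mirrors FastAPI.__call__: one exit stack per request, kept in the scope.
    async with AsyncExitStack() as stack:
        scope["astack"] = stack
        await endpoint(scope)

asyncio.run(app({}))
# events is now ["open", "handled with db-connection", "close"]
```

The key property is that cleanup runs after the endpoint returns but still inside the request's lifetime, which is exactly what dependencies with `yield` rely on.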
diff --git a/.venv/lib/python3.9/site-packages/fastapi/background.py b/.venv/lib/python3.9/site-packages/fastapi/background.py
new file mode 100644
index 0000000..dd3bbe2
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/background.py
@@ -0,0 +1 @@
+from starlette.background import BackgroundTasks as BackgroundTasks # noqa
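`background.py` simply re-exports Starlette's `BackgroundTasks` (the `import X as X` form marks it as an explicit re-export for type checkers). The underlying idea is just a list of callables that run after the response is sent; a toy self-contained stand-in (not Starlette's actual implementation) looks roughly like this:

```python
import asyncio
from typing import Any, Callable, Dict, List, Tuple

class MiniBackgroundTasks:
    """Toy stand-in for starlette.background.BackgroundTasks."""

    def __init__(self) -> None:
        self.tasks: List[Tuple[Callable[..., Any], tuple, Dict[str, Any]]] = []

    def add_task(self, func: Callable[..., Any], *args: Any, **kwargs: Any) -> None:
        # Tasks are only recorded here; nothing runs yet.
        self.tasks.append((func, args, kwargs))

    async def __call__(self) -> None:
        # Run every task in order, awaiting coroutines as needed.
        for func, args, kwargs in self.tasks:
            result = func(*args, **kwargs)
            if asyncio.iscoroutine(result):
                await result

log = []
tasks = MiniBackgroundTasks()
tasks.add_task(log.append, "sync task")

async def async_task(msg):
    log.append(msg)

tasks.add_task(async_task, "async task")
asyncio.run(tasks())
# log is now ["sync task", "async task"]
```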
diff --git a/.venv/lib/python3.9/site-packages/fastapi/concurrency.py b/.venv/lib/python3.9/site-packages/fastapi/concurrency.py
new file mode 100644
index 0000000..d1fdfe5
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/concurrency.py
@@ -0,0 +1,51 @@
+from typing import Any, Callable
+
+from starlette.concurrency import iterate_in_threadpool as iterate_in_threadpool # noqa
+from starlette.concurrency import run_in_threadpool as run_in_threadpool # noqa
+from starlette.concurrency import ( # noqa
+ run_until_first_complete as run_until_first_complete,
+)
+
+asynccontextmanager_error_message = """
+FastAPI's contextmanager_in_threadpool requires Python 3.7 or above,
+or the backport for Python 3.6, installed with:
+ pip install async-generator
+"""
+
+
+def _fake_asynccontextmanager(func: Callable[..., Any]) -> Callable[..., Any]:
+ def raiser(*args: Any, **kwargs: Any) -> Any:
+ raise RuntimeError(asynccontextmanager_error_message)
+
+ return raiser
+
+
+try:
+ from contextlib import asynccontextmanager as asynccontextmanager # type: ignore
+except ImportError:
+ try:
+ from async_generator import ( # type: ignore # isort: skip
+ asynccontextmanager as asynccontextmanager,
+ )
+ except ImportError: # pragma: no cover
+ asynccontextmanager = _fake_asynccontextmanager
+
+try:
+ from contextlib import AsyncExitStack as AsyncExitStack # type: ignore
+except ImportError:
+ try:
+ from async_exit_stack import AsyncExitStack as AsyncExitStack # type: ignore
+ except ImportError: # pragma: no cover
+ AsyncExitStack = None # type: ignore
+
+
+@asynccontextmanager # type: ignore
+async def contextmanager_in_threadpool(cm: Any) -> Any:
+ try:
+ yield await run_in_threadpool(cm.__enter__)
+ except Exception as e:
+ ok = await run_in_threadpool(cm.__exit__, type(e), e, None)
+ if not ok:
+ raise e
+ else:
+ await run_in_threadpool(cm.__exit__, None, None, None)
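`contextmanager_in_threadpool` above runs a synchronous context manager's `__enter__` and `__exit__` in a worker thread so they don't block the event loop. A simplified sketch of the same idea using only the standard library — `asyncio.to_thread` (Python 3.9+) stands in for Starlette's `run_in_threadpool`, and the names are illustrative:

```python
import asyncio
from contextlib import asynccontextmanager

@asynccontextmanager
async def cm_in_threadpool(cm):
    try:
        # Run the blocking __enter__ in a worker thread, not on the event loop.
        yield await asyncio.to_thread(cm.__enter__)
    except Exception as e:
        # Give the context manager a chance to suppress the exception.
        ok = await asyncio.to_thread(cm.__exit__, type(e), e, e.__traceback__)
        if not ok:
            raise
    else:
        await asyncio.to_thread(cm.__exit__, None, None, None)

class SyncResource:
    """A plain synchronous context manager, e.g. a DB session."""
    def __init__(self):
        self.events = []
    def __enter__(self):
        self.events.append("enter")
        return self
    def __exit__(self, exc_type, exc, tb):
        self.events.append("exit")
        return False  # do not suppress exceptions

async def main():
    res = SyncResource()
    async with cm_in_threadpool(res) as r:
        r.events.append("body")
    return res.events

events = asyncio.run(main())
# events == ["enter", "body", "exit"]
```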
diff --git a/.venv/lib/python3.9/site-packages/fastapi/datastructures.py b/.venv/lib/python3.9/site-packages/fastapi/datastructures.py
new file mode 100644
index 0000000..f22409c
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/datastructures.py
@@ -0,0 +1,47 @@
+from typing import Any, Callable, Iterable, Type, TypeVar
+
+from starlette.datastructures import State as State # noqa: F401
+from starlette.datastructures import UploadFile as StarletteUploadFile
+
+
+class UploadFile(StarletteUploadFile):
+ @classmethod
+ def __get_validators__(cls: Type["UploadFile"]) -> Iterable[Callable[..., Any]]:
+ yield cls.validate
+
+ @classmethod
+ def validate(cls: Type["UploadFile"], v: Any) -> Any:
+ if not isinstance(v, StarletteUploadFile):
+ raise ValueError(f"Expected UploadFile, received: {type(v)}")
+ return v
+
+
+class DefaultPlaceholder:
+ """
+ You shouldn't use this class directly.
+
+ It's used internally to recognize when a default value has been overwritten, even
+    if the overridden default value was truthy.
+ """
+
+ def __init__(self, value: Any):
+ self.value = value
+
+ def __bool__(self) -> bool:
+ return bool(self.value)
+
+ def __eq__(self, o: object) -> bool:
+ return isinstance(o, DefaultPlaceholder) and o.value == self.value
+
+
+DefaultType = TypeVar("DefaultType")
+
+
+def Default(value: DefaultType) -> DefaultType:
+ """
+ You shouldn't use this function directly.
+
+ It's used internally to recognize when a default value has been overwritten, even
+    if the overridden default value was truthy.
+ """
+ return DefaultPlaceholder(value) # type: ignore
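`Default`/`DefaultPlaceholder` exist because a plain default like `JSONResponse` is truthy, so a simple `if response_class:` check can't distinguish "the user passed this" from "this is the framework default". Wrapping the default in a placeholder turns that distinction into an `isinstance` check. A condensed, self-contained version of the pattern (with hypothetical names, and strings in place of real response classes):

```python
from typing import Any, TypeVar

T = TypeVar("T")

class Placeholder:
    """Wraps a default value so it can be told apart from a user-set one."""
    def __init__(self, value: Any) -> None:
        self.value = value

def default(value: T) -> T:
    # Lie to the type checker: callers see the original type,
    # but at runtime they receive a Placeholder wrapper.
    return Placeholder(value)  # type: ignore[return-value]

def make_route(response_class=default("JSONResponse")):
    if isinstance(response_class, Placeholder):
        # Untouched default: unwrap it, and note it wasn't user-set.
        return response_class.value, False
    return response_class, True

assert make_route() == ("JSONResponse", False)
assert make_route("PlainTextResponse") == ("PlainTextResponse", True)
# Crucially this works even though "JSONResponse" itself is truthy.
```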
diff --git a/.venv/lib/python3.9/site-packages/fastapi/dependencies/__init__.py b/.venv/lib/python3.9/site-packages/fastapi/dependencies/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/.venv/lib/python3.9/site-packages/fastapi/dependencies/models.py b/.venv/lib/python3.9/site-packages/fastapi/dependencies/models.py
new file mode 100644
index 0000000..443590b
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/dependencies/models.py
@@ -0,0 +1,58 @@
+from typing import Any, Callable, List, Optional, Sequence
+
+from fastapi.security.base import SecurityBase
+from pydantic.fields import ModelField
+
+
+class SecurityRequirement:
+ def __init__(
+ self, security_scheme: SecurityBase, scopes: Optional[Sequence[str]] = None
+ ):
+ self.security_scheme = security_scheme
+ self.scopes = scopes
+
+
+class Dependant:
+ def __init__(
+ self,
+ *,
+ path_params: Optional[List[ModelField]] = None,
+ query_params: Optional[List[ModelField]] = None,
+ header_params: Optional[List[ModelField]] = None,
+ cookie_params: Optional[List[ModelField]] = None,
+ body_params: Optional[List[ModelField]] = None,
+ dependencies: Optional[List["Dependant"]] = None,
+ security_schemes: Optional[List[SecurityRequirement]] = None,
+ name: Optional[str] = None,
+ call: Optional[Callable[..., Any]] = None,
+ request_param_name: Optional[str] = None,
+ websocket_param_name: Optional[str] = None,
+ http_connection_param_name: Optional[str] = None,
+ response_param_name: Optional[str] = None,
+ background_tasks_param_name: Optional[str] = None,
+ security_scopes_param_name: Optional[str] = None,
+ security_scopes: Optional[List[str]] = None,
+ use_cache: bool = True,
+ path: Optional[str] = None,
+ ) -> None:
+ self.path_params = path_params or []
+ self.query_params = query_params or []
+ self.header_params = header_params or []
+ self.cookie_params = cookie_params or []
+ self.body_params = body_params or []
+ self.dependencies = dependencies or []
+ self.security_requirements = security_schemes or []
+ self.request_param_name = request_param_name
+ self.websocket_param_name = websocket_param_name
+ self.http_connection_param_name = http_connection_param_name
+ self.response_param_name = response_param_name
+ self.background_tasks_param_name = background_tasks_param_name
+ self.security_scopes = security_scopes
+ self.security_scopes_param_name = security_scopes_param_name
+ self.name = name
+ self.call = call
+ self.use_cache = use_cache
+ # Store the path to be able to re-generate a dependable from it in overrides
+ self.path = path
+ # Save the cache key at creation to optimize performance
+ self.cache_key = (self.call, tuple(sorted(set(self.security_scopes or []))))
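The `cache_key` computed at the end of `Dependant.__init__` — `(call, tuple(sorted(set(scopes))))` — is what lets a shared dependency run only once per request: two dependants with the same callable and the same set of scopes produce the same hashable key, so the second lookup hits the cache. A stripped-down sketch of that per-request caching (the `solve` helper and names are illustrative, not FastAPI's API):

```python
def solve(call, scopes, cache):
    # Same (call, sorted unique scopes) -> same key -> computed once per request.
    key = (call, tuple(sorted(set(scopes or []))))
    if key in cache:
        return cache[key]
    result = call()
    cache[key] = result
    return result

calls = []

def get_settings():
    calls.append("computed")
    return {"debug": True}

request_cache = {}  # one cache per request
a = solve(get_settings, ["read"], request_cache)
b = solve(get_settings, ["read", "read"], request_cache)  # duplicate scope, same key
assert a is b
assert calls == ["computed"]  # the dependency executed only once
```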
diff --git a/.venv/lib/python3.9/site-packages/fastapi/dependencies/utils.py b/.venv/lib/python3.9/site-packages/fastapi/dependencies/utils.py
new file mode 100644
index 0000000..fcfaa2c
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/dependencies/utils.py
@@ -0,0 +1,783 @@
+import asyncio
+import inspect
+from contextlib import contextmanager
+from copy import deepcopy
+from typing import (
+ Any,
+ Callable,
+ Dict,
+ List,
+ Mapping,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from fastapi import params
+from fastapi.concurrency import (
+ AsyncExitStack,
+ _fake_asynccontextmanager,
+ asynccontextmanager,
+ contextmanager_in_threadpool,
+)
+from fastapi.dependencies.models import Dependant, SecurityRequirement
+from fastapi.logger import logger
+from fastapi.security.base import SecurityBase
+from fastapi.security.oauth2 import OAuth2, SecurityScopes
+from fastapi.security.open_id_connect_url import OpenIdConnect
+from fastapi.utils import create_response_field, get_path_param_names
+from pydantic import BaseModel, create_model
+from pydantic.error_wrappers import ErrorWrapper
+from pydantic.errors import MissingError
+from pydantic.fields import (
+ SHAPE_LIST,
+ SHAPE_SEQUENCE,
+ SHAPE_SET,
+ SHAPE_SINGLETON,
+ SHAPE_TUPLE,
+ SHAPE_TUPLE_ELLIPSIS,
+ FieldInfo,
+ ModelField,
+ Required,
+)
+from pydantic.schema import get_annotation_from_field_info
+from pydantic.typing import ForwardRef, evaluate_forwardref
+from pydantic.utils import lenient_issubclass
+from starlette.background import BackgroundTasks
+from starlette.concurrency import run_in_threadpool
+from starlette.datastructures import FormData, Headers, QueryParams, UploadFile
+from starlette.requests import HTTPConnection, Request
+from starlette.responses import Response
+from starlette.websockets import WebSocket
+
+sequence_shapes = {
+ SHAPE_LIST,
+ SHAPE_SET,
+ SHAPE_TUPLE,
+ SHAPE_SEQUENCE,
+ SHAPE_TUPLE_ELLIPSIS,
+}
+sequence_types = (list, set, tuple)
+sequence_shape_to_type = {
+ SHAPE_LIST: list,
+ SHAPE_SET: set,
+ SHAPE_TUPLE: tuple,
+ SHAPE_SEQUENCE: list,
+ SHAPE_TUPLE_ELLIPSIS: list,
+}
+
+
+multipart_not_installed_error = (
+ 'Form data requires "python-multipart" to be installed. \n'
+ 'You can install "python-multipart" with: \n\n'
+ "pip install python-multipart\n"
+)
+multipart_incorrect_install_error = (
+ 'Form data requires "python-multipart" to be installed. '
+ 'It seems you installed "multipart" instead. \n'
+ 'You can remove "multipart" with: \n\n'
+ "pip uninstall multipart\n\n"
+ 'And then install "python-multipart" with: \n\n'
+ "pip install python-multipart\n"
+)
+
+
+def check_file_field(field: ModelField) -> None:
+ field_info = field.field_info
+ if isinstance(field_info, params.Form):
+ try:
+            # __version__ is available in both the "multipart" and
+            # "python-multipart" packages, and can be mocked
+ from multipart import __version__ # type: ignore
+
+ assert __version__
+ try:
+ # parse_options_header is only available in the correct package, "python-multipart"
+ from multipart.multipart import parse_options_header # type: ignore
+
+ assert parse_options_header
+ except ImportError:
+ logger.error(multipart_incorrect_install_error)
+ raise RuntimeError(multipart_incorrect_install_error)
+ except ImportError:
+ logger.error(multipart_not_installed_error)
+ raise RuntimeError(multipart_not_installed_error)
+
+
+def get_param_sub_dependant(
+ *, param: inspect.Parameter, path: str, security_scopes: Optional[List[str]] = None
+) -> Dependant:
+ depends: params.Depends = param.default
+ if depends.dependency:
+ dependency = depends.dependency
+ else:
+ dependency = param.annotation
+ return get_sub_dependant(
+ depends=depends,
+ dependency=dependency,
+ path=path,
+ name=param.name,
+ security_scopes=security_scopes,
+ )
+
+
+def get_parameterless_sub_dependant(*, depends: params.Depends, path: str) -> Dependant:
+ assert callable(
+ depends.dependency
+ ), "A parameter-less dependency must have a callable dependency"
+ return get_sub_dependant(depends=depends, dependency=depends.dependency, path=path)
+
+
+def get_sub_dependant(
+ *,
+ depends: params.Depends,
+ dependency: Callable[..., Any],
+ path: str,
+ name: Optional[str] = None,
+ security_scopes: Optional[List[str]] = None,
+) -> Dependant:
+ security_requirement = None
+ security_scopes = security_scopes or []
+ if isinstance(depends, params.Security):
+ dependency_scopes = depends.scopes
+ security_scopes.extend(dependency_scopes)
+ if isinstance(dependency, SecurityBase):
+ use_scopes: List[str] = []
+ if isinstance(dependency, (OAuth2, OpenIdConnect)):
+ use_scopes = security_scopes
+ security_requirement = SecurityRequirement(
+ security_scheme=dependency, scopes=use_scopes
+ )
+ sub_dependant = get_dependant(
+ path=path,
+ call=dependency,
+ name=name,
+ security_scopes=security_scopes,
+ use_cache=depends.use_cache,
+ )
+ if security_requirement:
+ sub_dependant.security_requirements.append(security_requirement)
+ sub_dependant.security_scopes = security_scopes
+ return sub_dependant
+
+
+CacheKey = Tuple[Optional[Callable[..., Any]], Tuple[str, ...]]
+
+
+def get_flat_dependant(
+ dependant: Dependant,
+ *,
+ skip_repeats: bool = False,
+ visited: Optional[List[CacheKey]] = None,
+) -> Dependant:
+ if visited is None:
+ visited = []
+ visited.append(dependant.cache_key)
+
+ flat_dependant = Dependant(
+ path_params=dependant.path_params.copy(),
+ query_params=dependant.query_params.copy(),
+ header_params=dependant.header_params.copy(),
+ cookie_params=dependant.cookie_params.copy(),
+ body_params=dependant.body_params.copy(),
+ security_schemes=dependant.security_requirements.copy(),
+ use_cache=dependant.use_cache,
+ path=dependant.path,
+ )
+ for sub_dependant in dependant.dependencies:
+ if skip_repeats and sub_dependant.cache_key in visited:
+ continue
+ flat_sub = get_flat_dependant(
+ sub_dependant, skip_repeats=skip_repeats, visited=visited
+ )
+ flat_dependant.path_params.extend(flat_sub.path_params)
+ flat_dependant.query_params.extend(flat_sub.query_params)
+ flat_dependant.header_params.extend(flat_sub.header_params)
+ flat_dependant.cookie_params.extend(flat_sub.cookie_params)
+ flat_dependant.body_params.extend(flat_sub.body_params)
+ flat_dependant.security_requirements.extend(flat_sub.security_requirements)
+ return flat_dependant
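The recursive flattening above can be sketched in miniature with plain dicts instead of `Dependant` (the `key`/`params`/`deps` names below are illustrative, not FastAPI's API): the `visited` list plays the role of `skip_repeats`, so a dependency shared by two branches contributes its parameters only once.

```python
# Minimal sketch of the get_flat_dependant traversal, using plain dicts.
def flatten(node, visited=None):
    if visited is None:
        visited = []
    visited.append(node["key"])
    flat = list(node["params"])
    for sub in node["deps"]:
        if sub["key"] in visited:
            continue  # skip_repeats: this dependency was already collected
        flat += flatten(sub, visited)
    return flat

# A "diamond": both auth and items depend on the same shared dependency.
shared = {"key": "db", "params": ["session"], "deps": []}
tree = {
    "key": "root",
    "params": [],
    "deps": [
        {"key": "auth", "params": ["token"], "deps": [shared]},
        {"key": "items", "params": ["page"], "deps": [shared]},
    ],
}
```

Here `flatten(tree)` yields `["token", "session", "page"]`: the shared `"db"` node is visited under `auth` and skipped under `items`.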
+
+
+def get_flat_params(dependant: Dependant) -> List[ModelField]:
+ flat_dependant = get_flat_dependant(dependant, skip_repeats=True)
+ return (
+ flat_dependant.path_params
+ + flat_dependant.query_params
+ + flat_dependant.header_params
+ + flat_dependant.cookie_params
+ )
+
+
+def is_scalar_field(field: ModelField) -> bool:
+ field_info = field.field_info
+ if not (
+ field.shape == SHAPE_SINGLETON
+ and not lenient_issubclass(field.type_, BaseModel)
+ and not lenient_issubclass(field.type_, sequence_types + (dict,))
+ and not isinstance(field_info, params.Body)
+ ):
+ return False
+ if field.sub_fields:
+ if not all(is_scalar_field(f) for f in field.sub_fields):
+ return False
+ return True
+
+
+def is_scalar_sequence_field(field: ModelField) -> bool:
+ if (field.shape in sequence_shapes) and not lenient_issubclass(
+ field.type_, BaseModel
+ ):
+ if field.sub_fields is not None:
+ for sub_field in field.sub_fields:
+ if not is_scalar_field(sub_field):
+ return False
+ return True
+ if lenient_issubclass(field.type_, sequence_types):
+ return True
+ return False
+
+
+def get_typed_signature(call: Callable[..., Any]) -> inspect.Signature:
+ signature = inspect.signature(call)
+ globalns = getattr(call, "__globals__", {})
+ typed_params = [
+ inspect.Parameter(
+ name=param.name,
+ kind=param.kind,
+ default=param.default,
+ annotation=get_typed_annotation(param, globalns),
+ )
+ for param in signature.parameters.values()
+ ]
+ typed_signature = inspect.Signature(typed_params)
+ return typed_signature
+
+
+def get_typed_annotation(param: inspect.Parameter, globalns: Dict[str, Any]) -> Any:
+ annotation = param.annotation
+ if isinstance(annotation, str):
+ annotation = ForwardRef(annotation)
+ annotation = evaluate_forwardref(annotation, globalns, globalns)
+ return annotation
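`get_typed_signature` and `get_typed_annotation` resolve string annotations against the callable's `__globals__` by wrapping them in `ForwardRef` and evaluating. The same effect can be sketched with only the standard library's `typing.get_type_hints`:

```python
import inspect
from typing import get_type_hints

def repeat(text: "str", times: "int") -> "str":
    # The annotations are plain strings until they are evaluated.
    return text * times

# Resolve string annotations against the function's globals, much like
# get_typed_annotation evaluates each ForwardRef.
hints = get_type_hints(repeat)
resolved = {name: hints[name] for name in inspect.signature(repeat).parameters}
```

After resolution, `resolved` maps each parameter name to the real `str`/`int` type objects rather than strings.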
+
+
+async_contextmanager_dependencies_error = """
+FastAPI dependencies with yield require Python 3.7 or above,
+or the backports for Python 3.6, installed with:
+ pip install async-exit-stack async-generator
+"""
+
+
+def check_dependency_contextmanagers() -> None:
+ if AsyncExitStack is None or asynccontextmanager == _fake_asynccontextmanager:
+ raise RuntimeError(async_contextmanager_dependencies_error) # pragma: no cover
+
+
+def get_dependant(
+ *,
+ path: str,
+ call: Callable[..., Any],
+ name: Optional[str] = None,
+ security_scopes: Optional[List[str]] = None,
+ use_cache: bool = True,
+) -> Dependant:
+ path_param_names = get_path_param_names(path)
+ endpoint_signature = get_typed_signature(call)
+ signature_params = endpoint_signature.parameters
+ if is_gen_callable(call) or is_async_gen_callable(call):
+ check_dependency_contextmanagers()
+ dependant = Dependant(call=call, name=name, path=path, use_cache=use_cache)
+ for param_name, param in signature_params.items():
+ if isinstance(param.default, params.Depends):
+ sub_dependant = get_param_sub_dependant(
+ param=param, path=path, security_scopes=security_scopes
+ )
+ dependant.dependencies.append(sub_dependant)
+ continue
+ if add_non_field_param_to_dependency(param=param, dependant=dependant):
+ continue
+ param_field = get_param_field(
+ param=param, default_field_info=params.Query, param_name=param_name
+ )
+ if param_name in path_param_names:
+ assert is_scalar_field(
+ field=param_field
+ ), "Path params must be of one of the supported types"
+ if isinstance(param.default, params.Path):
+ ignore_default = False
+ else:
+ ignore_default = True
+ param_field = get_param_field(
+ param=param,
+ param_name=param_name,
+ default_field_info=params.Path,
+ force_type=params.ParamTypes.path,
+ ignore_default=ignore_default,
+ )
+ add_param_to_fields(field=param_field, dependant=dependant)
+ elif is_scalar_field(field=param_field):
+ add_param_to_fields(field=param_field, dependant=dependant)
+ elif isinstance(
+ param.default, (params.Query, params.Header)
+ ) and is_scalar_sequence_field(param_field):
+ add_param_to_fields(field=param_field, dependant=dependant)
+ else:
+ field_info = param_field.field_info
+ assert isinstance(
+ field_info, params.Body
+ ), f"Param: {param_field.name} can only be a request body, using Body(...)"
+ dependant.body_params.append(param_field)
+ return dependant
+
+
+def add_non_field_param_to_dependency(
+ *, param: inspect.Parameter, dependant: Dependant
+) -> Optional[bool]:
+ if lenient_issubclass(param.annotation, Request):
+ dependant.request_param_name = param.name
+ return True
+ elif lenient_issubclass(param.annotation, WebSocket):
+ dependant.websocket_param_name = param.name
+ return True
+ elif lenient_issubclass(param.annotation, HTTPConnection):
+ dependant.http_connection_param_name = param.name
+ return True
+ elif lenient_issubclass(param.annotation, Response):
+ dependant.response_param_name = param.name
+ return True
+ elif lenient_issubclass(param.annotation, BackgroundTasks):
+ dependant.background_tasks_param_name = param.name
+ return True
+ elif lenient_issubclass(param.annotation, SecurityScopes):
+ dependant.security_scopes_param_name = param.name
+ return True
+ return None
+
+
+def get_param_field(
+ *,
+ param: inspect.Parameter,
+ param_name: str,
+ default_field_info: Type[params.Param] = params.Param,
+ force_type: Optional[params.ParamTypes] = None,
+ ignore_default: bool = False,
+) -> ModelField:
+ default_value = Required
+ had_schema = False
+ if not param.default == param.empty and ignore_default is False:
+ default_value = param.default
+ if isinstance(default_value, FieldInfo):
+ had_schema = True
+ field_info = default_value
+ default_value = field_info.default
+ if (
+ isinstance(field_info, params.Param)
+ and getattr(field_info, "in_", None) is None
+ ):
+ field_info.in_ = default_field_info.in_
+ if force_type:
+ field_info.in_ = force_type # type: ignore
+ else:
+ field_info = default_field_info(default_value)
+ required = default_value == Required
+ annotation: Any = Any
+ if not param.annotation == param.empty:
+ annotation = param.annotation
+ annotation = get_annotation_from_field_info(annotation, field_info, param_name)
+ if not field_info.alias and getattr(field_info, "convert_underscores", None):
+ alias = param.name.replace("_", "-")
+ else:
+ alias = field_info.alias or param.name
+ field = create_response_field(
+ name=param.name,
+ type_=annotation,
+ default=None if required else default_value,
+ alias=alias,
+ required=required,
+ field_info=field_info,
+ )
+ field.required = required
+ if not had_schema and not is_scalar_field(field=field):
+ field.field_info = params.Body(field_info.default)
+
+ return field
+
+
+def add_param_to_fields(*, field: ModelField, dependant: Dependant) -> None:
+ field_info = cast(params.Param, field.field_info)
+ if field_info.in_ == params.ParamTypes.path:
+ dependant.path_params.append(field)
+ elif field_info.in_ == params.ParamTypes.query:
+ dependant.query_params.append(field)
+ elif field_info.in_ == params.ParamTypes.header:
+ dependant.header_params.append(field)
+ else:
+ assert (
+ field_info.in_ == params.ParamTypes.cookie
+ ), f"non-body parameters must be in path, query, header or cookie: {field.name}"
+ dependant.cookie_params.append(field)
+
+
+def is_coroutine_callable(call: Callable[..., Any]) -> bool:
+ if inspect.isroutine(call):
+ return inspect.iscoroutinefunction(call)
+ if inspect.isclass(call):
+ return False
+ call = getattr(call, "__call__", None)
+ return inspect.iscoroutinefunction(call)
+
+
+def is_async_gen_callable(call: Callable[..., Any]) -> bool:
+ if inspect.isasyncgenfunction(call):
+ return True
+ call = getattr(call, "__call__", None)
+ return inspect.isasyncgenfunction(call)
+
+
+def is_gen_callable(call: Callable[..., Any]) -> bool:
+ if inspect.isgeneratorfunction(call):
+ return True
+ call = getattr(call, "__call__", None)
+ return inspect.isgeneratorfunction(call)
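The three `is_*_callable` helpers above share one idea: plain functions are inspected directly, classes never count (calling a class constructs an instance, it does not await anything), and other objects are inspected via their `__call__` method. A stdlib-only sketch of the coroutine case (the name `looks_coroutine` is illustrative, not FastAPI's):

```python
import inspect

def looks_coroutine(call):
    # Mirrors is_coroutine_callable: routines are checked directly, classes
    # are rejected, and instances fall back to their __call__ method.
    if inspect.isroutine(call):
        return inspect.iscoroutinefunction(call)
    if inspect.isclass(call):
        return False
    return inspect.iscoroutinefunction(getattr(call, "__call__", None))

async def async_dep():
    return 1

class AsyncDep:
    async def __call__(self):
        return 1

class SyncDep:
    def __call__(self):
        return 1
```

Note that the class itself (`AsyncDep`) is not a coroutine callable, while its instance (`AsyncDep()`) is.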
+
+
+async def solve_generator(
+ *, call: Callable[..., Any], stack: AsyncExitStack, sub_values: Dict[str, Any]
+) -> Any:
+ if is_gen_callable(call):
+ cm = contextmanager_in_threadpool(contextmanager(call)(**sub_values))
+ elif is_async_gen_callable(call):
+ if not inspect.isasyncgenfunction(call):
+ # asynccontextmanager from the async_generator backport (pre Python 3.7)
+ # does not support callables that are not functions or methods.
+ # See https://github.com/python-trio/async_generator/issues/32
+ #
+ # Expand the callable class into its __call__ method before decorating it.
+ # This approach will work on newer python versions as well.
+ call = getattr(call, "__call__", None)
+ cm = asynccontextmanager(call)(**sub_values)
+ return await stack.enter_async_context(cm)
+
+
+async def solve_dependencies(
+ *,
+ request: Union[Request, WebSocket],
+ dependant: Dependant,
+ body: Optional[Union[Dict[str, Any], FormData]] = None,
+ background_tasks: Optional[BackgroundTasks] = None,
+ response: Optional[Response] = None,
+ dependency_overrides_provider: Optional[Any] = None,
+ dependency_cache: Optional[Dict[Tuple[Callable[..., Any], Tuple[str]], Any]] = None,
+) -> Tuple[
+ Dict[str, Any],
+ List[ErrorWrapper],
+ Optional[BackgroundTasks],
+ Response,
+ Dict[Tuple[Callable[..., Any], Tuple[str]], Any],
+]:
+ values: Dict[str, Any] = {}
+ errors: List[ErrorWrapper] = []
+ response = response or Response(
+ content=None,
+ status_code=None, # type: ignore
+ headers=None, # type: ignore # in Starlette
+ media_type=None, # type: ignore # in Starlette
+ background=None, # type: ignore # in Starlette
+ )
+ dependency_cache = dependency_cache or {}
+ sub_dependant: Dependant
+ for sub_dependant in dependant.dependencies:
+ sub_dependant.call = cast(Callable[..., Any], sub_dependant.call)
+ sub_dependant.cache_key = cast(
+ Tuple[Callable[..., Any], Tuple[str]], sub_dependant.cache_key
+ )
+ call = sub_dependant.call
+ use_sub_dependant = sub_dependant
+ if (
+ dependency_overrides_provider
+ and dependency_overrides_provider.dependency_overrides
+ ):
+ original_call = sub_dependant.call
+ call = getattr(
+ dependency_overrides_provider, "dependency_overrides", {}
+ ).get(original_call, original_call)
+ use_path: str = sub_dependant.path # type: ignore
+ use_sub_dependant = get_dependant(
+ path=use_path,
+ call=call,
+ name=sub_dependant.name,
+ security_scopes=sub_dependant.security_scopes,
+ )
+ use_sub_dependant.security_scopes = sub_dependant.security_scopes
+
+ solved_result = await solve_dependencies(
+ request=request,
+ dependant=use_sub_dependant,
+ body=body,
+ background_tasks=background_tasks,
+ response=response,
+ dependency_overrides_provider=dependency_overrides_provider,
+ dependency_cache=dependency_cache,
+ )
+ (
+ sub_values,
+ sub_errors,
+ background_tasks,
+ _, # the sub-dependency returns the same response we already have
+ sub_dependency_cache,
+ ) = solved_result
+ dependency_cache.update(sub_dependency_cache)
+ if sub_errors:
+ errors.extend(sub_errors)
+ continue
+ if sub_dependant.use_cache and sub_dependant.cache_key in dependency_cache:
+ solved = dependency_cache[sub_dependant.cache_key]
+ elif is_gen_callable(call) or is_async_gen_callable(call):
+ stack = request.scope.get("fastapi_astack")
+ if stack is None:
+ raise RuntimeError(
+ async_contextmanager_dependencies_error
+ ) # pragma: no cover
+ solved = await solve_generator(
+ call=call, stack=stack, sub_values=sub_values
+ )
+ elif is_coroutine_callable(call):
+ solved = await call(**sub_values)
+ else:
+ solved = await run_in_threadpool(call, **sub_values)
+ if sub_dependant.name is not None:
+ values[sub_dependant.name] = solved
+ if sub_dependant.cache_key not in dependency_cache:
+ dependency_cache[sub_dependant.cache_key] = solved
+ path_values, path_errors = request_params_to_args(
+ dependant.path_params, request.path_params
+ )
+ query_values, query_errors = request_params_to_args(
+ dependant.query_params, request.query_params
+ )
+ header_values, header_errors = request_params_to_args(
+ dependant.header_params, request.headers
+ )
+ cookie_values, cookie_errors = request_params_to_args(
+ dependant.cookie_params, request.cookies
+ )
+ values.update(path_values)
+ values.update(query_values)
+ values.update(header_values)
+ values.update(cookie_values)
+ errors += path_errors + query_errors + header_errors + cookie_errors
+ if dependant.body_params:
+ (
+ body_values,
+ body_errors,
+ ) = await request_body_to_args( # body_params checked above
+ required_params=dependant.body_params, received_body=body
+ )
+ values.update(body_values)
+ errors.extend(body_errors)
+ if dependant.http_connection_param_name:
+ values[dependant.http_connection_param_name] = request
+ if dependant.request_param_name and isinstance(request, Request):
+ values[dependant.request_param_name] = request
+ elif dependant.websocket_param_name and isinstance(request, WebSocket):
+ values[dependant.websocket_param_name] = request
+ if dependant.background_tasks_param_name:
+ if background_tasks is None:
+ background_tasks = BackgroundTasks()
+ values[dependant.background_tasks_param_name] = background_tasks
+ if dependant.response_param_name:
+ values[dependant.response_param_name] = response
+ if dependant.security_scopes_param_name:
+ values[dependant.security_scopes_param_name] = SecurityScopes(
+ scopes=dependant.security_scopes
+ )
+ return values, errors, background_tasks, response, dependency_cache
+
+
+def request_params_to_args(
+ required_params: Sequence[ModelField],
+ received_params: Union[Mapping[str, Any], QueryParams, Headers],
+) -> Tuple[Dict[str, Any], List[ErrorWrapper]]:
+ values = {}
+ errors = []
+ for field in required_params:
+ if is_scalar_sequence_field(field) and isinstance(
+ received_params, (QueryParams, Headers)
+ ):
+ value = received_params.getlist(field.alias) or field.default
+ else:
+ value = received_params.get(field.alias)
+ field_info = field.field_info
+ assert isinstance(
+ field_info, params.Param
+ ), "Params must be subclasses of Param"
+ if value is None:
+ if field.required:
+ errors.append(
+ ErrorWrapper(
+ MissingError(), loc=(field_info.in_.value, field.alias)
+ )
+ )
+ else:
+ values[field.name] = deepcopy(field.default)
+ continue
+ v_, errors_ = field.validate(
+ value, values, loc=(field_info.in_.value, field.alias)
+ )
+ if isinstance(errors_, ErrorWrapper):
+ errors.append(errors_)
+ elif isinstance(errors_, list):
+ errors.extend(errors_)
+ else:
+ values[field.name] = v_
+ return values, errors
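`request_params_to_args` calls `getlist` for scalar-sequence fields, so a repeated query key like `?tag=a&tag=b` becomes a list rather than a single value. The multi-value behaviour can be sketched with the standard library's `urllib.parse.parse_qs`, which keeps every value for a repeated key:

```python
from urllib.parse import parse_qs

# parse_qs keeps repeated keys as lists, analogous to the
# QueryParams.getlist call used for scalar-sequence fields above.
raw = parse_qs("tag=a&tag=b&limit=10")
tags = raw.get("tag", [])            # sequence field: take every value
limit = raw.get("limit", ["20"])[0]  # scalar field: take the single value
```

A missing key falls back to the field's default, mirroring the `or field.default` branch in the code above.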
+
+
+async def request_body_to_args(
+ required_params: List[ModelField],
+ received_body: Optional[Union[Dict[str, Any], FormData]],
+) -> Tuple[Dict[str, Any], List[ErrorWrapper]]:
+ values = {}
+ errors = []
+ if required_params:
+ field = required_params[0]
+ field_info = field.field_info
+ embed = getattr(field_info, "embed", None)
+ field_alias_omitted = len(required_params) == 1 and not embed
+ if field_alias_omitted:
+ received_body = {field.alias: received_body}
+
+ for field in required_params:
+ loc: Tuple[str, ...]
+ if field_alias_omitted:
+ loc = ("body",)
+ else:
+ loc = ("body", field.alias)
+
+ value: Optional[Any] = None
+ if received_body is not None:
+ if (
+ field.shape in sequence_shapes or field.type_ in sequence_types
+ ) and isinstance(received_body, FormData):
+ value = received_body.getlist(field.alias)
+ else:
+ try:
+ value = received_body.get(field.alias)
+ except AttributeError:
+ errors.append(get_missing_field_error(loc))
+ continue
+ if (
+ value is None
+ or (isinstance(field_info, params.Form) and value == "")
+ or (
+ isinstance(field_info, params.Form)
+ and field.shape in sequence_shapes
+ and len(value) == 0
+ )
+ ):
+ if field.required:
+ errors.append(get_missing_field_error(loc))
+ else:
+ values[field.name] = deepcopy(field.default)
+ continue
+ if (
+ isinstance(field_info, params.File)
+ and lenient_issubclass(field.type_, bytes)
+ and isinstance(value, UploadFile)
+ ):
+ value = await value.read()
+ elif (
+ field.shape in sequence_shapes
+ and isinstance(field_info, params.File)
+ and lenient_issubclass(field.type_, bytes)
+ and isinstance(value, sequence_types)
+ ):
+ awaitables = [sub_value.read() for sub_value in value]
+ contents = await asyncio.gather(*awaitables)
+ value = sequence_shape_to_type[field.shape](contents)
+
+ v_, errors_ = field.validate(value, values, loc=loc)
+
+ if isinstance(errors_, ErrorWrapper):
+ errors.append(errors_)
+ elif isinstance(errors_, list):
+ errors.extend(errors_)
+ else:
+ values[field.name] = v_
+ return values, errors
+
+
+def get_missing_field_error(loc: Tuple[str, ...]) -> ErrorWrapper:
+ missing_field_error = ErrorWrapper(MissingError(), loc=loc)
+ return missing_field_error
+
+
+def get_schema_compatible_field(*, field: ModelField) -> ModelField:
+ out_field = field
+ if lenient_issubclass(field.type_, UploadFile):
+ use_type: type = bytes
+ if field.shape in sequence_shapes:
+ use_type = List[bytes]
+ out_field = create_response_field(
+ name=field.name,
+ type_=use_type,
+ class_validators=field.class_validators,
+ model_config=field.model_config,
+ default=field.default,
+ required=field.required,
+ alias=field.alias,
+ field_info=field.field_info,
+ )
+ return out_field
+
+
+def get_body_field(*, dependant: Dependant, name: str) -> Optional[ModelField]:
+ flat_dependant = get_flat_dependant(dependant)
+ if not flat_dependant.body_params:
+ return None
+ first_param = flat_dependant.body_params[0]
+ field_info = first_param.field_info
+ embed = getattr(field_info, "embed", None)
+ body_param_names_set = {param.name for param in flat_dependant.body_params}
+ if len(body_param_names_set) == 1 and not embed:
+ final_field = get_schema_compatible_field(field=first_param)
+ check_file_field(final_field)
+ return final_field
+ # If any field requires embedding, all of them have to be embedded,
+ # in case a sub-dependency is evaluated with a single unique body field
+ # that is combined (embedded) with other body fields.
+ for param in flat_dependant.body_params:
+ setattr(param.field_info, "embed", True)
+ model_name = "Body_" + name
+ BodyModel = create_model(model_name)
+ for f in flat_dependant.body_params:
+ BodyModel.__fields__[f.name] = get_schema_compatible_field(field=f)
+ required = any(True for f in flat_dependant.body_params if f.required)
+
+ BodyFieldInfo_kwargs: Dict[str, Any] = dict(default=None)
+ if any(isinstance(f.field_info, params.File) for f in flat_dependant.body_params):
+ BodyFieldInfo: Type[params.Body] = params.File
+ elif any(isinstance(f.field_info, params.Form) for f in flat_dependant.body_params):
+ BodyFieldInfo = params.Form
+ else:
+ BodyFieldInfo = params.Body
+
+ body_param_media_types = [
+ getattr(f.field_info, "media_type")
+ for f in flat_dependant.body_params
+ if isinstance(f.field_info, params.Body)
+ ]
+ if len(set(body_param_media_types)) == 1:
+ BodyFieldInfo_kwargs["media_type"] = body_param_media_types[0]
+ final_field = create_response_field(
+ name="body",
+ type_=BodyModel,
+ required=required,
+ alias="body",
+ field_info=BodyFieldInfo(**BodyFieldInfo_kwargs),
+ )
+ check_file_field(final_field)
+ return final_field
diff --git a/.venv/lib/python3.9/site-packages/fastapi/encoders.py b/.venv/lib/python3.9/site-packages/fastapi/encoders.py
new file mode 100644
index 0000000..6a2a75d
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/encoders.py
@@ -0,0 +1,150 @@
+from collections import defaultdict
+from enum import Enum
+from pathlib import PurePath
+from types import GeneratorType
+from typing import Any, Callable, Dict, List, Optional, Set, Tuple, Union
+
+from pydantic import BaseModel
+from pydantic.json import ENCODERS_BY_TYPE
+
+SetIntStr = Set[Union[int, str]]
+DictIntStrAny = Dict[Union[int, str], Any]
+
+
+def generate_encoders_by_class_tuples(
+ type_encoder_map: Dict[Any, Callable[[Any], Any]]
+) -> Dict[Callable[[Any], Any], Tuple[Any, ...]]:
+ encoders_by_class_tuples: Dict[Callable[[Any], Any], Tuple[Any, ...]] = defaultdict(
+ tuple
+ )
+ for type_, encoder in type_encoder_map.items():
+ encoders_by_class_tuples[encoder] += (type_,)
+ return encoders_by_class_tuples
+
+
+encoders_by_class_tuples = generate_encoders_by_class_tuples(ENCODERS_BY_TYPE)
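`generate_encoders_by_class_tuples` inverts a type-to-encoder map into an encoder-to-tuple-of-types map, so lookup needs one `isinstance` check per encoder instead of one per type. A stdlib-only sketch of the same inversion, with a small hypothetical stand-in for pydantic's `ENCODERS_BY_TYPE`:

```python
from collections import defaultdict
from datetime import date, datetime
from decimal import Decimal

# Hypothetical stand-in for pydantic's ENCODERS_BY_TYPE.
type_encoder_map = {datetime: str, date: str, Decimal: float}

# Invert: group all types that share an encoder into one tuple.
encoders_by_class = defaultdict(tuple)
for type_, encoder in type_encoder_map.items():
    encoders_by_class[encoder] += (type_,)

def encode(obj):
    # One isinstance check per encoder, exactly like the fallback loop
    # at the end of jsonable_encoder above.
    for encoder, classes in encoders_by_class.items():
        if isinstance(obj, classes):
            return encoder(obj)
    return obj
```

Both `datetime` and `date` collapse into the single `str` encoder entry, which is why `jsonable_encoder` can iterate over encoders rather than types in its fallback path.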
+
+
+def jsonable_encoder(
+ obj: Any,
+ include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ by_alias: bool = True,
+ exclude_unset: bool = False,
+ exclude_defaults: bool = False,
+ exclude_none: bool = False,
+ custom_encoder: Dict[Any, Callable[[Any], Any]] = {},
+ sqlalchemy_safe: bool = True,
+) -> Any:
+ if include is not None and not isinstance(include, set):
+ include = set(include)
+ if exclude is not None and not isinstance(exclude, set):
+ exclude = set(exclude)
+ if isinstance(obj, BaseModel):
+ encoder = getattr(obj.__config__, "json_encoders", {})
+ if custom_encoder:
+ encoder.update(custom_encoder)
+ obj_dict = obj.dict(
+ include=include, # type: ignore # in Pydantic
+ exclude=exclude, # type: ignore # in Pydantic
+ by_alias=by_alias,
+ exclude_unset=exclude_unset,
+ exclude_none=exclude_none,
+ exclude_defaults=exclude_defaults,
+ )
+ if "__root__" in obj_dict:
+ obj_dict = obj_dict["__root__"]
+ return jsonable_encoder(
+ obj_dict,
+ exclude_none=exclude_none,
+ exclude_defaults=exclude_defaults,
+ custom_encoder=encoder,
+ sqlalchemy_safe=sqlalchemy_safe,
+ )
+ if isinstance(obj, Enum):
+ return obj.value
+ if isinstance(obj, PurePath):
+ return str(obj)
+ if isinstance(obj, (str, int, float, type(None))):
+ return obj
+ if isinstance(obj, dict):
+ encoded_dict = {}
+ for key, value in obj.items():
+ if (
+ (
+ not sqlalchemy_safe
+ or (not isinstance(key, str))
+ or (not key.startswith("_sa"))
+ )
+ and (value is not None or not exclude_none)
+ and ((include and key in include) or not exclude or key not in exclude)
+ ):
+ encoded_key = jsonable_encoder(
+ key,
+ by_alias=by_alias,
+ exclude_unset=exclude_unset,
+ exclude_none=exclude_none,
+ custom_encoder=custom_encoder,
+ sqlalchemy_safe=sqlalchemy_safe,
+ )
+ encoded_value = jsonable_encoder(
+ value,
+ by_alias=by_alias,
+ exclude_unset=exclude_unset,
+ exclude_none=exclude_none,
+ custom_encoder=custom_encoder,
+ sqlalchemy_safe=sqlalchemy_safe,
+ )
+ encoded_dict[encoded_key] = encoded_value
+ return encoded_dict
+ if isinstance(obj, (list, set, frozenset, GeneratorType, tuple)):
+ encoded_list = []
+ for item in obj:
+ encoded_list.append(
+ jsonable_encoder(
+ item,
+ include=include,
+ exclude=exclude,
+ by_alias=by_alias,
+ exclude_unset=exclude_unset,
+ exclude_defaults=exclude_defaults,
+ exclude_none=exclude_none,
+ custom_encoder=custom_encoder,
+ sqlalchemy_safe=sqlalchemy_safe,
+ )
+ )
+ return encoded_list
+
+ if custom_encoder:
+ if type(obj) in custom_encoder:
+ return custom_encoder[type(obj)](obj)
+ else:
+ for encoder_type, encoder in custom_encoder.items():
+ if isinstance(obj, encoder_type):
+ return encoder(obj)
+
+ if type(obj) in ENCODERS_BY_TYPE:
+ return ENCODERS_BY_TYPE[type(obj)](obj)
+ for encoder, classes_tuple in encoders_by_class_tuples.items():
+ if isinstance(obj, classes_tuple):
+ return encoder(obj)
+
+ errors: List[Exception] = []
+ try:
+ data = dict(obj)
+ except Exception as e:
+ errors.append(e)
+ try:
+ data = vars(obj)
+ except Exception as e:
+ errors.append(e)
+ raise ValueError(errors)
+ return jsonable_encoder(
+ data,
+ by_alias=by_alias,
+ exclude_unset=exclude_unset,
+ exclude_defaults=exclude_defaults,
+ exclude_none=exclude_none,
+ custom_encoder=custom_encoder,
+ sqlalchemy_safe=sqlalchemy_safe,
+ )
diff --git a/.venv/lib/python3.9/site-packages/fastapi/exception_handlers.py b/.venv/lib/python3.9/site-packages/fastapi/exception_handlers.py
new file mode 100644
index 0000000..2b286d7
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/exception_handlers.py
@@ -0,0 +1,25 @@
+from fastapi.encoders import jsonable_encoder
+from fastapi.exceptions import RequestValidationError
+from starlette.exceptions import HTTPException
+from starlette.requests import Request
+from starlette.responses import JSONResponse
+from starlette.status import HTTP_422_UNPROCESSABLE_ENTITY
+
+
+async def http_exception_handler(request: Request, exc: HTTPException) -> JSONResponse:
+ headers = getattr(exc, "headers", None)
+ if headers:
+ return JSONResponse(
+ {"detail": exc.detail}, status_code=exc.status_code, headers=headers
+ )
+ else:
+ return JSONResponse({"detail": exc.detail}, status_code=exc.status_code)
+
+
+async def request_validation_exception_handler(
+ request: Request, exc: RequestValidationError
+) -> JSONResponse:
+ return JSONResponse(
+ status_code=HTTP_422_UNPROCESSABLE_ENTITY,
+ content={"detail": jsonable_encoder(exc.errors())},
+ )
diff --git a/.venv/lib/python3.9/site-packages/fastapi/exceptions.py b/.venv/lib/python3.9/site-packages/fastapi/exceptions.py
new file mode 100644
index 0000000..8d92311
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/exceptions.py
@@ -0,0 +1,37 @@
+from typing import Any, Dict, Optional, Sequence
+
+from pydantic import ValidationError, create_model
+from pydantic.error_wrappers import ErrorList
+from starlette.exceptions import HTTPException as StarletteHTTPException
+
+
+class HTTPException(StarletteHTTPException):
+ def __init__(
+ self,
+ status_code: int,
+ detail: Any = None,
+ headers: Optional[Dict[str, Any]] = None,
+ ) -> None:
+ super().__init__(status_code=status_code, detail=detail)
+ self.headers = headers
+
+
+RequestErrorModel = create_model("Request")
+WebSocketErrorModel = create_model("WebSocket")
+
+
+class FastAPIError(RuntimeError):
+ """
+ A generic, FastAPI-specific error.
+ """
+
+
+class RequestValidationError(ValidationError):
+ def __init__(self, errors: Sequence[ErrorList], *, body: Any = None) -> None:
+ self.body = body
+ super().__init__(errors, RequestErrorModel)
+
+
+class WebSocketRequestValidationError(ValidationError):
+ def __init__(self, errors: Sequence[ErrorList]) -> None:
+ super().__init__(errors, WebSocketErrorModel)
diff --git a/.venv/lib/python3.9/site-packages/fastapi/logger.py b/.venv/lib/python3.9/site-packages/fastapi/logger.py
new file mode 100644
index 0000000..5b2c4ad
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/logger.py
@@ -0,0 +1,3 @@
+import logging
+
+logger = logging.getLogger("fastapi")
diff --git a/.venv/lib/python3.9/site-packages/fastapi/middleware/__init__.py b/.venv/lib/python3.9/site-packages/fastapi/middleware/__init__.py
new file mode 100644
index 0000000..620296d
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/middleware/__init__.py
@@ -0,0 +1 @@
+from starlette.middleware import Middleware as Middleware
diff --git a/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..baaf797
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/cors.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/cors.cpython-39.pyc
new file mode 100644
index 0000000..822da47
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/cors.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/gzip.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/gzip.cpython-39.pyc
new file mode 100644
index 0000000..77c7338
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/gzip.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/httpsredirect.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/httpsredirect.cpython-39.pyc
new file mode 100644
index 0000000..2074b10
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/httpsredirect.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/trustedhost.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/trustedhost.cpython-39.pyc
new file mode 100644
index 0000000..6c7e8c7
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/trustedhost.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/wsgi.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/wsgi.cpython-39.pyc
new file mode 100644
index 0000000..bb4910a
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/middleware/__pycache__/wsgi.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/middleware/cors.py b/.venv/lib/python3.9/site-packages/fastapi/middleware/cors.py
new file mode 100644
index 0000000..8dfaad0
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/middleware/cors.py
@@ -0,0 +1 @@
+from starlette.middleware.cors import CORSMiddleware as CORSMiddleware # noqa
diff --git a/.venv/lib/python3.9/site-packages/fastapi/middleware/gzip.py b/.venv/lib/python3.9/site-packages/fastapi/middleware/gzip.py
new file mode 100644
index 0000000..bbeb2cc
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/middleware/gzip.py
@@ -0,0 +1 @@
+from starlette.middleware.gzip import GZipMiddleware as GZipMiddleware # noqa
diff --git a/.venv/lib/python3.9/site-packages/fastapi/middleware/httpsredirect.py b/.venv/lib/python3.9/site-packages/fastapi/middleware/httpsredirect.py
new file mode 100644
index 0000000..b7a3d8e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/middleware/httpsredirect.py
@@ -0,0 +1,3 @@
+from starlette.middleware.httpsredirect import ( # noqa
+ HTTPSRedirectMiddleware as HTTPSRedirectMiddleware,
+)
diff --git a/.venv/lib/python3.9/site-packages/fastapi/middleware/trustedhost.py b/.venv/lib/python3.9/site-packages/fastapi/middleware/trustedhost.py
new file mode 100644
index 0000000..08d7e03
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/middleware/trustedhost.py
@@ -0,0 +1,3 @@
+from starlette.middleware.trustedhost import ( # noqa
+ TrustedHostMiddleware as TrustedHostMiddleware,
+)
diff --git a/.venv/lib/python3.9/site-packages/fastapi/middleware/wsgi.py b/.venv/lib/python3.9/site-packages/fastapi/middleware/wsgi.py
new file mode 100644
index 0000000..c4c6a79
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/middleware/wsgi.py
@@ -0,0 +1 @@
+from starlette.middleware.wsgi import WSGIMiddleware as WSGIMiddleware # noqa
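Each of the files above is a one-line shim that re-exports a Starlette middleware class under the `fastapi.middleware` namespace. The redundant `X as X` spelling is the PEP 484 convention for an explicit re-export: it signals to type checkers and linters (hence the `# noqa` markers) that the import is intentionally public, not unused. A minimal sketch of the same idiom, using a stdlib class as a stand-in since starlette may not be installed:

```python
# `from m import name as name` marks `name` as an explicit re-export:
# under mypy's --no-implicit-reexport (and for tools like autoflake), a
# plain `from m import name` in a shim module would be treated as an
# unused, private import. GzipFile stands in for a Starlette middleware.
from gzip import GzipFile as GzipFile  # noqa

# The re-exported name is usable from this module exactly as if it were
# defined here.
print(GzipFile.__name__)  # GzipFile
```

The shim pattern lets applications import middleware from `fastapi.middleware.*` while the actual implementations live in Starlette.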
diff --git a/.venv/lib/python3.9/site-packages/fastapi/openapi/__init__.py b/.venv/lib/python3.9/site-packages/fastapi/openapi/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..1c28424
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/constants.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/constants.cpython-39.pyc
new file mode 100644
index 0000000..bf5570f
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/constants.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/docs.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/docs.cpython-39.pyc
new file mode 100644
index 0000000..5a46ebf
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/docs.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/models.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/models.cpython-39.pyc
new file mode 100644
index 0000000..ec960d9
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/models.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/utils.cpython-39.pyc b/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/utils.cpython-39.pyc
new file mode 100644
index 0000000..6c33a8e
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/fastapi/openapi/__pycache__/utils.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/fastapi/openapi/constants.py b/.venv/lib/python3.9/site-packages/fastapi/openapi/constants.py
new file mode 100644
index 0000000..3e69e55
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/openapi/constants.py
@@ -0,0 +1,3 @@
+METHODS_WITH_BODY = {"GET", "HEAD", "POST", "PUT", "DELETE", "PATCH"}
+STATUS_CODES_WITH_NO_BODY = {100, 101, 102, 103, 204, 304}
+REF_PREFIX = "#/components/schemas/"
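These constants drive schema generation in `fastapi/openapi/utils.py` later in this diff: 1xx, 204, and 304 responses carry no body, so no response schema is emitted for them, and model schemas are referenced via `REF_PREFIX`. The helper below is a hypothetical illustration of that decision, not part of the FastAPI API:

```python
# Constants copied from fastapi/openapi/constants.py above.
METHODS_WITH_BODY = {"GET", "HEAD", "POST", "PUT", "DELETE", "PATCH"}
STATUS_CODES_WITH_NO_BODY = {100, 101, 102, 103, 204, 304}
REF_PREFIX = "#/components/schemas/"


def response_schema_ref(status_code: int, model_name: str):
    """Return a $ref to a component schema, or None for body-less codes.

    Hypothetical helper showing how the constants are consulted.
    """
    if status_code in STATUS_CODES_WITH_NO_BODY:
        return None
    return {"$ref": REF_PREFIX + model_name}


print(response_schema_ref(200, "Item"))  # {'$ref': '#/components/schemas/Item'}
print(response_schema_ref(204, "Item"))  # None
```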
diff --git a/.venv/lib/python3.9/site-packages/fastapi/openapi/docs.py b/.venv/lib/python3.9/site-packages/fastapi/openapi/docs.py
new file mode 100644
index 0000000..fd22e4e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/openapi/docs.py
@@ -0,0 +1,177 @@
+import json
+from typing import Any, Dict, Optional
+
+from fastapi.encoders import jsonable_encoder
+from starlette.responses import HTMLResponse
+
+
+def get_swagger_ui_html(
+ *,
+ openapi_url: str,
+ title: str,
+ swagger_js_url: str = "https://cdn.jsdelivr.net/npm/swagger-ui-dist@3/swagger-ui-bundle.js",
+ swagger_css_url: str = "https://cdn.jsdelivr.net/npm/swagger-ui-dist@3/swagger-ui.css",
+ swagger_favicon_url: str = "https://fastapi.tiangolo.com/img/favicon.png",
+ oauth2_redirect_url: Optional[str] = None,
+ init_oauth: Optional[Dict[str, Any]] = None,
+) -> HTMLResponse:
+
+ html = f"""
+
+
+
+
+
+ {title}
+
+
+
+
+
+
+
+
+
+ """
+ return HTMLResponse(html)
+
+
+def get_redoc_html(
+ *,
+ openapi_url: str,
+ title: str,
+ redoc_js_url: str = "https://cdn.jsdelivr.net/npm/redoc@next/bundles/redoc.standalone.js",
+ redoc_favicon_url: str = "https://fastapi.tiangolo.com/img/favicon.png",
+ with_google_fonts: bool = True,
+) -> HTMLResponse:
+ html = f"""
+
+
+
+ {title}
+
+
+
+ """
+ if with_google_fonts:
+ html += """
+
+ """
+ html += f"""
+
+
+
+
+
+
+
+
+
+ """
+ return HTMLResponse(html)
+
+
+def get_swagger_ui_oauth2_redirect_html() -> HTMLResponse:
+ html = """
+
+
+
+
+
+
+ """
+ return HTMLResponse(content=html)
diff --git a/.venv/lib/python3.9/site-packages/fastapi/openapi/models.py b/.venv/lib/python3.9/site-packages/fastapi/openapi/models.py
new file mode 100644
index 0000000..fd48094
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/openapi/models.py
@@ -0,0 +1,351 @@
+from enum import Enum
+from typing import Any, Callable, Dict, Iterable, List, Optional, Union
+
+from fastapi.logger import logger
+from pydantic import AnyUrl, BaseModel, Field
+
+try:
+ import email_validator # type: ignore
+
+ assert email_validator # make autoflake ignore the unused import
+ from pydantic import EmailStr
+except ImportError: # pragma: no cover
+
+ class EmailStr(str): # type: ignore
+ @classmethod
+ def __get_validators__(cls) -> Iterable[Callable[..., Any]]:
+ yield cls.validate
+
+ @classmethod
+ def validate(cls, v: Any) -> str:
+ logger.warning(
+ "email-validator not installed, email fields will be treated as str.\n"
+ "To install, run: pip install email-validator"
+ )
+ return str(v)
+
+
+class Contact(BaseModel):
+ name: Optional[str] = None
+ url: Optional[AnyUrl] = None
+ email: Optional[EmailStr] = None
+
+
+class License(BaseModel):
+ name: str
+ url: Optional[AnyUrl] = None
+
+
+class Info(BaseModel):
+ title: str
+ description: Optional[str] = None
+ termsOfService: Optional[str] = None
+ contact: Optional[Contact] = None
+ license: Optional[License] = None
+ version: str
+
+
+class ServerVariable(BaseModel):
+ enum: Optional[List[str]] = None
+ default: str
+ description: Optional[str] = None
+
+
+class Server(BaseModel):
+ url: Union[AnyUrl, str]
+ description: Optional[str] = None
+ variables: Optional[Dict[str, ServerVariable]] = None
+
+
+class Reference(BaseModel):
+ ref: str = Field(..., alias="$ref")
+
+
+class Discriminator(BaseModel):
+ propertyName: str
+ mapping: Optional[Dict[str, str]] = None
+
+
+class XML(BaseModel):
+ name: Optional[str] = None
+ namespace: Optional[str] = None
+ prefix: Optional[str] = None
+ attribute: Optional[bool] = None
+ wrapped: Optional[bool] = None
+
+
+class ExternalDocumentation(BaseModel):
+ description: Optional[str] = None
+ url: AnyUrl
+
+
+class SchemaBase(BaseModel):
+ ref: Optional[str] = Field(None, alias="$ref")
+ title: Optional[str] = None
+ multipleOf: Optional[float] = None
+ maximum: Optional[float] = None
+ exclusiveMaximum: Optional[float] = None
+ minimum: Optional[float] = None
+ exclusiveMinimum: Optional[float] = None
+ maxLength: Optional[int] = Field(None, gte=0)
+ minLength: Optional[int] = Field(None, gte=0)
+ pattern: Optional[str] = None
+ maxItems: Optional[int] = Field(None, gte=0)
+ minItems: Optional[int] = Field(None, gte=0)
+ uniqueItems: Optional[bool] = None
+ maxProperties: Optional[int] = Field(None, gte=0)
+ minProperties: Optional[int] = Field(None, gte=0)
+ required: Optional[List[str]] = None
+ enum: Optional[List[Any]] = None
+ type: Optional[str] = None
+ allOf: Optional[List[Any]] = None
+ oneOf: Optional[List[Any]] = None
+ anyOf: Optional[List[Any]] = None
+ not_: Optional[Any] = Field(None, alias="not")
+ items: Optional[Any] = None
+ properties: Optional[Dict[str, Any]] = None
+ additionalProperties: Optional[Union[Dict[str, Any], bool]] = None
+ description: Optional[str] = None
+ format: Optional[str] = None
+ default: Optional[Any] = None
+ nullable: Optional[bool] = None
+ discriminator: Optional[Discriminator] = None
+ readOnly: Optional[bool] = None
+ writeOnly: Optional[bool] = None
+ xml: Optional[XML] = None
+ externalDocs: Optional[ExternalDocumentation] = None
+ example: Optional[Any] = None
+ deprecated: Optional[bool] = None
+
+
+class Schema(SchemaBase):
+ allOf: Optional[List[SchemaBase]] = None
+ oneOf: Optional[List[SchemaBase]] = None
+ anyOf: Optional[List[SchemaBase]] = None
+ not_: Optional[SchemaBase] = Field(None, alias="not")
+ items: Optional[SchemaBase] = None
+ properties: Optional[Dict[str, SchemaBase]] = None
+ additionalProperties: Optional[Union[Dict[str, Any], bool]] = None
+
+
+class Example(BaseModel):
+ summary: Optional[str] = None
+ description: Optional[str] = None
+ value: Optional[Any] = None
+ externalValue: Optional[AnyUrl] = None
+
+
+class ParameterInType(Enum):
+ query = "query"
+ header = "header"
+ path = "path"
+ cookie = "cookie"
+
+
+class Encoding(BaseModel):
+ contentType: Optional[str] = None
+ # Workaround OpenAPI recursive reference, using Any
+ headers: Optional[Dict[str, Union[Any, Reference]]] = None
+ style: Optional[str] = None
+ explode: Optional[bool] = None
+ allowReserved: Optional[bool] = None
+
+
+class MediaType(BaseModel):
+ schema_: Optional[Union[Schema, Reference]] = Field(None, alias="schema")
+ example: Optional[Any] = None
+ examples: Optional[Dict[str, Union[Example, Reference]]] = None
+ encoding: Optional[Dict[str, Encoding]] = None
+
+
+class ParameterBase(BaseModel):
+ description: Optional[str] = None
+ required: Optional[bool] = None
+ deprecated: Optional[bool] = None
+ # Serialization rules for simple scenarios
+ style: Optional[str] = None
+ explode: Optional[bool] = None
+ allowReserved: Optional[bool] = None
+ schema_: Optional[Union[Schema, Reference]] = Field(None, alias="schema")
+ example: Optional[Any] = None
+ examples: Optional[Dict[str, Union[Example, Reference]]] = None
+ # Serialization rules for more complex scenarios
+ content: Optional[Dict[str, MediaType]] = None
+
+
+class Parameter(ParameterBase):
+ name: str
+ in_: ParameterInType = Field(..., alias="in")
+
+
+class Header(ParameterBase):
+ pass
+
+
+# Workaround OpenAPI recursive reference
+class EncodingWithHeaders(Encoding):
+ headers: Optional[Dict[str, Union[Header, Reference]]] = None
+
+
+class RequestBody(BaseModel):
+ description: Optional[str] = None
+ content: Dict[str, MediaType]
+ required: Optional[bool] = None
+
+
+class Link(BaseModel):
+ operationRef: Optional[str] = None
+ operationId: Optional[str] = None
+ parameters: Optional[Dict[str, Union[Any, str]]] = None
+ requestBody: Optional[Union[Any, str]] = None
+ description: Optional[str] = None
+ server: Optional[Server] = None
+
+
+class Response(BaseModel):
+ description: str
+ headers: Optional[Dict[str, Union[Header, Reference]]] = None
+ content: Optional[Dict[str, MediaType]] = None
+ links: Optional[Dict[str, Union[Link, Reference]]] = None
+
+
+class Operation(BaseModel):
+ tags: Optional[List[str]] = None
+ summary: Optional[str] = None
+ description: Optional[str] = None
+ externalDocs: Optional[ExternalDocumentation] = None
+ operationId: Optional[str] = None
+ parameters: Optional[List[Union[Parameter, Reference]]] = None
+ requestBody: Optional[Union[RequestBody, Reference]] = None
+ responses: Dict[str, Response]
+ # Workaround OpenAPI recursive reference
+ callbacks: Optional[Dict[str, Union[Dict[str, Any], Reference]]] = None
+ deprecated: Optional[bool] = None
+ security: Optional[List[Dict[str, List[str]]]] = None
+ servers: Optional[List[Server]] = None
+
+
+class PathItem(BaseModel):
+ ref: Optional[str] = Field(None, alias="$ref")
+ summary: Optional[str] = None
+ description: Optional[str] = None
+ get: Optional[Operation] = None
+ put: Optional[Operation] = None
+ post: Optional[Operation] = None
+ delete: Optional[Operation] = None
+ options: Optional[Operation] = None
+ head: Optional[Operation] = None
+ patch: Optional[Operation] = None
+ trace: Optional[Operation] = None
+ servers: Optional[List[Server]] = None
+ parameters: Optional[List[Union[Parameter, Reference]]] = None
+
+
+# Workaround OpenAPI recursive reference
+class OperationWithCallbacks(BaseModel):
+ callbacks: Optional[Dict[str, Union[Dict[str, PathItem], Reference]]] = None
+
+
+class SecuritySchemeType(Enum):
+ apiKey = "apiKey"
+ http = "http"
+ oauth2 = "oauth2"
+ openIdConnect = "openIdConnect"
+
+
+class SecurityBase(BaseModel):
+ type_: SecuritySchemeType = Field(..., alias="type")
+ description: Optional[str] = None
+
+
+class APIKeyIn(Enum):
+ query = "query"
+ header = "header"
+ cookie = "cookie"
+
+
+class APIKey(SecurityBase):
+ type_ = Field(SecuritySchemeType.apiKey, alias="type")
+ in_: APIKeyIn = Field(..., alias="in")
+ name: str
+
+
+class HTTPBase(SecurityBase):
+ type_ = Field(SecuritySchemeType.http, alias="type")
+ scheme: str
+
+
+class HTTPBearer(HTTPBase):
+ scheme = "bearer"
+ bearerFormat: Optional[str] = None
+
+
+class OAuthFlow(BaseModel):
+ refreshUrl: Optional[str] = None
+ scopes: Dict[str, str] = {}
+
+
+class OAuthFlowImplicit(OAuthFlow):
+ authorizationUrl: str
+
+
+class OAuthFlowPassword(OAuthFlow):
+ tokenUrl: str
+
+
+class OAuthFlowClientCredentials(OAuthFlow):
+ tokenUrl: str
+
+
+class OAuthFlowAuthorizationCode(OAuthFlow):
+ authorizationUrl: str
+ tokenUrl: str
+
+
+class OAuthFlows(BaseModel):
+ implicit: Optional[OAuthFlowImplicit] = None
+ password: Optional[OAuthFlowPassword] = None
+ clientCredentials: Optional[OAuthFlowClientCredentials] = None
+ authorizationCode: Optional[OAuthFlowAuthorizationCode] = None
+
+
+class OAuth2(SecurityBase):
+ type_ = Field(SecuritySchemeType.oauth2, alias="type")
+ flows: OAuthFlows
+
+
+class OpenIdConnect(SecurityBase):
+ type_ = Field(SecuritySchemeType.openIdConnect, alias="type")
+ openIdConnectUrl: str
+
+
+SecurityScheme = Union[APIKey, HTTPBase, OAuth2, OpenIdConnect, HTTPBearer]
+
+
+class Components(BaseModel):
+ schemas: Optional[Dict[str, Union[Schema, Reference]]] = None
+ responses: Optional[Dict[str, Union[Response, Reference]]] = None
+ parameters: Optional[Dict[str, Union[Parameter, Reference]]] = None
+ examples: Optional[Dict[str, Union[Example, Reference]]] = None
+ requestBodies: Optional[Dict[str, Union[RequestBody, Reference]]] = None
+ headers: Optional[Dict[str, Union[Header, Reference]]] = None
+ securitySchemes: Optional[Dict[str, Union[SecurityScheme, Reference]]] = None
+ links: Optional[Dict[str, Union[Link, Reference]]] = None
+ callbacks: Optional[Dict[str, Union[Dict[str, PathItem], Reference]]] = None
+
+
+class Tag(BaseModel):
+ name: str
+ description: Optional[str] = None
+ externalDocs: Optional[ExternalDocumentation] = None
+
+
+class OpenAPI(BaseModel):
+ openapi: str
+ info: Info
+ servers: Optional[List[Server]] = None
+ paths: Dict[str, PathItem]
+ components: Optional[Components] = None
+ security: Optional[List[Dict[str, List[str]]]] = None
+ tags: Optional[List[Tag]] = None
+ externalDocs: Optional[ExternalDocumentation] = None
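Per the `OpenAPI` model above, only `openapi`, `info` (with `title` and `version`), and `paths` are required at the top level; everything else is optional. A minimal document of the shape these models validate, written as a plain dict (the path, operation, and model names are illustrative):

```python
# Smallest useful document matching the OpenAPI model: required top-level
# fields plus one path with one GET operation.
minimal_openapi = {
    "openapi": "3.0.2",
    "info": {"title": "Example API", "version": "0.1.0"},
    "paths": {
        "/items/{item_id}": {
            "get": {
                "summary": "Read Item",
                "operationId": "read_item_items__item_id__get",
                "parameters": [
                    {
                        "name": "item_id",
                        "in": "path",  # must be one of ParameterInType
                        "required": True,
                        "schema": {"title": "Item Id", "type": "integer"},
                    }
                ],
                # Response objects require a description.
                "responses": {"200": {"description": "Successful Response"}},
            }
        }
    },
}

required_top_level = {"openapi", "info", "paths"}
print(required_top_level.issubset(minimal_openapi))  # True
```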
diff --git a/.venv/lib/python3.9/site-packages/fastapi/openapi/utils.py b/.venv/lib/python3.9/site-packages/fastapi/openapi/utils.py
new file mode 100644
index 0000000..410ba93
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/openapi/utils.py
@@ -0,0 +1,377 @@
+import http.client
+from enum import Enum
+from typing import Any, Dict, List, Optional, Sequence, Set, Tuple, Type, Union, cast
+
+from fastapi import routing
+from fastapi.datastructures import DefaultPlaceholder
+from fastapi.dependencies.models import Dependant
+from fastapi.dependencies.utils import get_flat_dependant, get_flat_params
+from fastapi.encoders import jsonable_encoder
+from fastapi.openapi.constants import (
+ METHODS_WITH_BODY,
+ REF_PREFIX,
+ STATUS_CODES_WITH_NO_BODY,
+)
+from fastapi.openapi.models import OpenAPI
+from fastapi.params import Body, Param
+from fastapi.responses import Response
+from fastapi.utils import (
+ deep_dict_update,
+ generate_operation_id_for_path,
+ get_model_definitions,
+)
+from pydantic import BaseModel
+from pydantic.fields import ModelField
+from pydantic.schema import (
+ field_schema,
+ get_flat_models_from_fields,
+ get_model_name_map,
+)
+from pydantic.utils import lenient_issubclass
+from starlette.responses import JSONResponse
+from starlette.routing import BaseRoute
+from starlette.status import HTTP_422_UNPROCESSABLE_ENTITY
+
+validation_error_definition = {
+ "title": "ValidationError",
+ "type": "object",
+ "properties": {
+ "loc": {"title": "Location", "type": "array", "items": {"type": "string"}},
+ "msg": {"title": "Message", "type": "string"},
+ "type": {"title": "Error Type", "type": "string"},
+ },
+ "required": ["loc", "msg", "type"],
+}
+
+validation_error_response_definition = {
+ "title": "HTTPValidationError",
+ "type": "object",
+ "properties": {
+ "detail": {
+ "title": "Detail",
+ "type": "array",
+ "items": {"$ref": REF_PREFIX + "ValidationError"},
+ }
+ },
+}
+
+status_code_ranges: Dict[str, str] = {
+ "1XX": "Information",
+ "2XX": "Success",
+ "3XX": "Redirection",
+ "4XX": "Client Error",
+ "5XX": "Server Error",
+ "DEFAULT": "Default Response",
+}
+
+
+def get_openapi_security_definitions(
+ flat_dependant: Dependant,
+) -> Tuple[Dict[str, Any], List[Dict[str, Any]]]:
+ security_definitions = {}
+ operation_security = []
+ for security_requirement in flat_dependant.security_requirements:
+ security_definition = jsonable_encoder(
+ security_requirement.security_scheme.model,
+ by_alias=True,
+ exclude_none=True,
+ )
+ security_name = security_requirement.security_scheme.scheme_name
+ security_definitions[security_name] = security_definition
+ operation_security.append({security_name: security_requirement.scopes})
+ return security_definitions, operation_security
+
+
+def get_openapi_operation_parameters(
+ *,
+ all_route_params: Sequence[ModelField],
+ model_name_map: Dict[Union[Type[BaseModel], Type[Enum]], str],
+) -> List[Dict[str, Any]]:
+ parameters = []
+ for param in all_route_params:
+ field_info = param.field_info
+ field_info = cast(Param, field_info)
+ parameter = {
+ "name": param.alias,
+ "in": field_info.in_.value,
+ "required": param.required,
+ "schema": field_schema(
+ param, model_name_map=model_name_map, ref_prefix=REF_PREFIX
+ )[0],
+ }
+ if field_info.description:
+ parameter["description"] = field_info.description
+ if field_info.deprecated:
+ parameter["deprecated"] = field_info.deprecated
+ parameters.append(parameter)
+ return parameters
+
+
+def get_openapi_operation_request_body(
+ *,
+ body_field: Optional[ModelField],
+ model_name_map: Dict[Union[Type[BaseModel], Type[Enum]], str],
+) -> Optional[Dict[str, Any]]:
+ if not body_field:
+ return None
+ assert isinstance(body_field, ModelField)
+ body_schema, _, _ = field_schema(
+ body_field, model_name_map=model_name_map, ref_prefix=REF_PREFIX
+ )
+ field_info = cast(Body, body_field.field_info)
+ request_media_type = field_info.media_type
+ required = body_field.required
+ request_body_oai: Dict[str, Any] = {}
+ if required:
+ request_body_oai["required"] = required
+ request_body_oai["content"] = {request_media_type: {"schema": body_schema}}
+ return request_body_oai
+
+
+def generate_operation_id(*, route: routing.APIRoute, method: str) -> str:
+ if route.operation_id:
+ return route.operation_id
+ path: str = route.path_format
+ return generate_operation_id_for_path(name=route.name, path=path, method=method)
+
+
+def generate_operation_summary(*, route: routing.APIRoute, method: str) -> str:
+ if route.summary:
+ return route.summary
+ return route.name.replace("_", " ").title()
+
+
+def get_openapi_operation_metadata(
+ *, route: routing.APIRoute, method: str
+) -> Dict[str, Any]:
+ operation: Dict[str, Any] = {}
+ if route.tags:
+ operation["tags"] = route.tags
+ operation["summary"] = generate_operation_summary(route=route, method=method)
+ if route.description:
+ operation["description"] = route.description
+ operation["operationId"] = generate_operation_id(route=route, method=method)
+ if route.deprecated:
+ operation["deprecated"] = route.deprecated
+ return operation
+
+
+def get_openapi_path(
+ *, route: routing.APIRoute, model_name_map: Dict[type, str]
+) -> Tuple[Dict[str, Any], Dict[str, Any], Dict[str, Any]]:
+ path = {}
+ security_schemes: Dict[str, Any] = {}
+ definitions: Dict[str, Any] = {}
+ assert route.methods is not None, "Methods must be a list"
+ if isinstance(route.response_class, DefaultPlaceholder):
+ current_response_class: Type[Response] = route.response_class.value
+ else:
+ current_response_class = route.response_class
+ assert current_response_class, "A response class is needed to generate OpenAPI"
+ route_response_media_type: Optional[str] = current_response_class.media_type
+ if route.include_in_schema:
+ for method in route.methods:
+ operation = get_openapi_operation_metadata(route=route, method=method)
+ parameters: List[Dict[str, Any]] = []
+ flat_dependant = get_flat_dependant(route.dependant, skip_repeats=True)
+ security_definitions, operation_security = get_openapi_security_definitions(
+ flat_dependant=flat_dependant
+ )
+ if operation_security:
+ operation.setdefault("security", []).extend(operation_security)
+ if security_definitions:
+ security_schemes.update(security_definitions)
+ all_route_params = get_flat_params(route.dependant)
+ operation_parameters = get_openapi_operation_parameters(
+ all_route_params=all_route_params, model_name_map=model_name_map
+ )
+ parameters.extend(operation_parameters)
+ if parameters:
+ operation["parameters"] = list(
+ {param["name"]: param for param in parameters}.values()
+ )
+ if method in METHODS_WITH_BODY:
+ request_body_oai = get_openapi_operation_request_body(
+ body_field=route.body_field, model_name_map=model_name_map
+ )
+ if request_body_oai:
+ operation["requestBody"] = request_body_oai
+ if route.callbacks:
+ callbacks = {}
+ for callback in route.callbacks:
+ if isinstance(callback, routing.APIRoute):
+ (
+ cb_path,
+ cb_security_schemes,
+ cb_definitions,
+ ) = get_openapi_path(
+ route=callback, model_name_map=model_name_map
+ )
+ callbacks[callback.name] = {callback.path: cb_path}
+ operation["callbacks"] = callbacks
+ status_code = str(route.status_code)
+ operation.setdefault("responses", {}).setdefault(status_code, {})[
+ "description"
+ ] = route.response_description
+ if (
+ route_response_media_type
+ and route.status_code not in STATUS_CODES_WITH_NO_BODY
+ ):
+ response_schema = {"type": "string"}
+ if lenient_issubclass(current_response_class, JSONResponse):
+ if route.response_field:
+ response_schema, _, _ = field_schema(
+ route.response_field,
+ model_name_map=model_name_map,
+ ref_prefix=REF_PREFIX,
+ )
+ else:
+ response_schema = {}
+ operation.setdefault("responses", {}).setdefault(
+ status_code, {}
+ ).setdefault("content", {}).setdefault(route_response_media_type, {})[
+ "schema"
+ ] = response_schema
+ if route.responses:
+ operation_responses = operation.setdefault("responses", {})
+ for (
+ additional_status_code,
+ additional_response,
+ ) in route.responses.items():
+ process_response = additional_response.copy()
+ process_response.pop("model", None)
+ status_code_key = str(additional_status_code).upper()
+ if status_code_key == "DEFAULT":
+ status_code_key = "default"
+ openapi_response = operation_responses.setdefault(
+ status_code_key, {}
+ )
+ assert isinstance(
+ process_response, dict
+ ), "An additional response must be a dict"
+ field = route.response_fields.get(additional_status_code)
+ additional_field_schema: Optional[Dict[str, Any]] = None
+ if field:
+ additional_field_schema, _, _ = field_schema(
+ field, model_name_map=model_name_map, ref_prefix=REF_PREFIX
+ )
+ media_type = route_response_media_type or "application/json"
+ additional_schema = (
+ process_response.setdefault("content", {})
+ .setdefault(media_type, {})
+ .setdefault("schema", {})
+ )
+ deep_dict_update(additional_schema, additional_field_schema)
+ status_text: Optional[str] = status_code_ranges.get(
+ str(additional_status_code).upper()
+ ) or http.client.responses.get(int(additional_status_code))
+ description = (
+ process_response.get("description")
+ or openapi_response.get("description")
+ or status_text
+ or "Additional Response"
+ )
+ deep_dict_update(openapi_response, process_response)
+ openapi_response["description"] = description
+ http422 = str(HTTP_422_UNPROCESSABLE_ENTITY)
+ if (all_route_params or route.body_field) and not any(
+ [
+ status in operation["responses"]
+ for status in [http422, "4XX", "default"]
+ ]
+ ):
+ operation["responses"][http422] = {
+ "description": "Validation Error",
+ "content": {
+ "application/json": {
+ "schema": {"$ref": REF_PREFIX + "HTTPValidationError"}
+ }
+ },
+ }
+ if "ValidationError" not in definitions:
+ definitions.update(
+ {
+ "ValidationError": validation_error_definition,
+ "HTTPValidationError": validation_error_response_definition,
+ }
+ )
+ path[method.lower()] = operation
+ return path, security_schemes, definitions
+
+
+def get_flat_models_from_routes(
+ routes: Sequence[BaseRoute],
+) -> Set[Union[Type[BaseModel], Type[Enum]]]:
+ body_fields_from_routes: List[ModelField] = []
+ responses_from_routes: List[ModelField] = []
+ request_fields_from_routes: List[ModelField] = []
+ callback_flat_models: Set[Union[Type[BaseModel], Type[Enum]]] = set()
+ for route in routes:
+ if getattr(route, "include_in_schema", None) and isinstance(
+ route, routing.APIRoute
+ ):
+ if route.body_field:
+ assert isinstance(
+ route.body_field, ModelField
+ ), "A request body must be a Pydantic Field"
+ body_fields_from_routes.append(route.body_field)
+ if route.response_field:
+ responses_from_routes.append(route.response_field)
+ if route.response_fields:
+ responses_from_routes.extend(route.response_fields.values())
+ if route.callbacks:
+ callback_flat_models |= get_flat_models_from_routes(route.callbacks)
+ params = get_flat_params(route.dependant)
+ request_fields_from_routes.extend(params)
+
+ flat_models = callback_flat_models | get_flat_models_from_fields(
+ body_fields_from_routes + responses_from_routes + request_fields_from_routes,
+ known_models=set(),
+ )
+ return flat_models
+
+
+def get_openapi(
+ *,
+ title: str,
+ version: str,
+ openapi_version: str = "3.0.2",
+ description: Optional[str] = None,
+ routes: Sequence[BaseRoute],
+ tags: Optional[List[Dict[str, Any]]] = None,
+ servers: Optional[List[Dict[str, Union[str, Any]]]] = None,
+) -> Dict[str, Any]:
+ info = {"title": title, "version": version}
+ if description:
+ info["description"] = description
+ output: Dict[str, Any] = {"openapi": openapi_version, "info": info}
+ if servers:
+ output["servers"] = servers
+ components: Dict[str, Dict[str, Any]] = {}
+ paths: Dict[str, Dict[str, Any]] = {}
+ flat_models = get_flat_models_from_routes(routes)
+ model_name_map = get_model_name_map(flat_models)
+ definitions = get_model_definitions(
+ flat_models=flat_models, model_name_map=model_name_map
+ )
+ for route in routes:
+ if isinstance(route, routing.APIRoute):
+ result = get_openapi_path(route=route, model_name_map=model_name_map)
+ if result:
+ path, security_schemes, path_definitions = result
+ if path:
+ paths.setdefault(route.path_format, {}).update(path)
+ if security_schemes:
+ components.setdefault("securitySchemes", {}).update(
+ security_schemes
+ )
+ if path_definitions:
+ definitions.update(path_definitions)
+ if definitions:
+ components["schemas"] = {k: definitions[k] for k in sorted(definitions)}
+ if components:
+ output["components"] = components
+ output["paths"] = paths
+ if tags:
+ output["tags"] = tags
+ return jsonable_encoder(OpenAPI(**output), by_alias=True, exclude_none=True) # type: ignore
diff --git a/.venv/lib/python3.9/site-packages/fastapi/param_functions.py b/.venv/lib/python3.9/site-packages/fastapi/param_functions.py
new file mode 100644
index 0000000..9ebb591
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/param_functions.py
@@ -0,0 +1,253 @@
+from typing import Any, Callable, Optional, Sequence
+
+from fastapi import params
+
+
+def Path( # noqa: N802
+ default: Any,
+ *,
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ deprecated: Optional[bool] = None,
+ **extra: Any,
+) -> Any:
+ return params.Path(
+ default=default,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ deprecated=deprecated,
+ **extra,
+ )
+
+
+def Query( # noqa: N802
+ default: Any,
+ *,
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ deprecated: Optional[bool] = None,
+ **extra: Any,
+) -> Any:
+ return params.Query(
+ default,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ deprecated=deprecated,
+ **extra,
+ )
+
+
+def Header( # noqa: N802
+ default: Any,
+ *,
+ alias: Optional[str] = None,
+ convert_underscores: bool = True,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ deprecated: Optional[bool] = None,
+ **extra: Any,
+) -> Any:
+ return params.Header(
+ default,
+ alias=alias,
+ convert_underscores=convert_underscores,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ deprecated=deprecated,
+ **extra,
+ )
+
+
+def Cookie( # noqa: N802
+ default: Any,
+ *,
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ deprecated: Optional[bool] = None,
+ **extra: Any,
+) -> Any:
+ return params.Cookie(
+ default,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ deprecated=deprecated,
+ **extra,
+ )
+
+
+def Body( # noqa: N802
+ default: Any,
+ *,
+ embed: bool = False,
+ media_type: str = "application/json",
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ **extra: Any,
+) -> Any:
+ return params.Body(
+ default,
+ embed=embed,
+ media_type=media_type,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ **extra,
+ )
+
+
+def Form( # noqa: N802
+ default: Any,
+ *,
+ media_type: str = "application/x-www-form-urlencoded",
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ **extra: Any,
+) -> Any:
+ return params.Form(
+ default,
+ media_type=media_type,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ **extra,
+ )
+
+
+def File( # noqa: N802
+ default: Any,
+ *,
+ media_type: str = "multipart/form-data",
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ **extra: Any,
+) -> Any:
+ return params.File(
+ default,
+ media_type=media_type,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ **extra,
+ )
+
+
+def Depends( # noqa: N802
+ dependency: Optional[Callable[..., Any]] = None, *, use_cache: bool = True
+) -> Any:
+ return params.Depends(dependency=dependency, use_cache=use_cache)
+
+
+def Security( # noqa: N802
+ dependency: Optional[Callable[..., Any]] = None,
+ *,
+ scopes: Optional[Sequence[str]] = None,
+ use_cache: bool = True,
+) -> Any:
+ return params.Security(dependency=dependency, scopes=scopes, use_cache=use_cache)
diff --git a/.venv/lib/python3.9/site-packages/fastapi/params.py b/.venv/lib/python3.9/site-packages/fastapi/params.py
new file mode 100644
index 0000000..aa3269a
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/params.py
@@ -0,0 +1,338 @@
+from enum import Enum
+from typing import Any, Callable, Optional, Sequence
+
+from pydantic.fields import FieldInfo
+
+
+class ParamTypes(Enum):
+ query = "query"
+ header = "header"
+ path = "path"
+ cookie = "cookie"
+
+
+class Param(FieldInfo):
+ in_: ParamTypes
+
+ def __init__(
+ self,
+ default: Any,
+ *,
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ deprecated: Optional[bool] = None,
+ **extra: Any,
+ ):
+ self.deprecated = deprecated
+ super().__init__(
+ default,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ **extra,
+ )
+
+ def __repr__(self) -> str:
+ return f"{self.__class__.__name__}({self.default})"
+
+
+class Path(Param):
+ in_ = ParamTypes.path
+
+ def __init__(
+ self,
+ default: Any,
+ *,
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ deprecated: Optional[bool] = None,
+ **extra: Any,
+ ):
+ self.in_ = self.in_
+ super().__init__(
+ ...,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ deprecated=deprecated,
+ **extra,
+ )
+
+
+class Query(Param):
+ in_ = ParamTypes.query
+
+ def __init__(
+ self,
+ default: Any,
+ *,
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ deprecated: Optional[bool] = None,
+ **extra: Any,
+ ):
+ super().__init__(
+ default,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ deprecated=deprecated,
+ **extra,
+ )
+
+
+class Header(Param):
+ in_ = ParamTypes.header
+
+ def __init__(
+ self,
+ default: Any,
+ *,
+ alias: Optional[str] = None,
+ convert_underscores: bool = True,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ deprecated: Optional[bool] = None,
+ **extra: Any,
+ ):
+ self.convert_underscores = convert_underscores
+ super().__init__(
+ default,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ deprecated=deprecated,
+ **extra,
+ )
+
+
+class Cookie(Param):
+ in_ = ParamTypes.cookie
+
+ def __init__(
+ self,
+ default: Any,
+ *,
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ deprecated: Optional[bool] = None,
+ **extra: Any,
+ ):
+ super().__init__(
+ default,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ deprecated=deprecated,
+ **extra,
+ )
+
+
+class Body(FieldInfo):
+ def __init__(
+ self,
+ default: Any,
+ *,
+ embed: bool = False,
+ media_type: str = "application/json",
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ **extra: Any,
+ ):
+ self.embed = embed
+ self.media_type = media_type
+ super().__init__(
+ default,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ **extra,
+ )
+
+ def __repr__(self) -> str:
+ return f"{self.__class__.__name__}({self.default})"
+
+
+class Form(Body):
+ def __init__(
+ self,
+ default: Any,
+ *,
+ media_type: str = "application/x-www-form-urlencoded",
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ **extra: Any,
+ ):
+ super().__init__(
+ default,
+ embed=True,
+ media_type=media_type,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ **extra,
+ )
+
+
+class File(Form):
+ def __init__(
+ self,
+ default: Any,
+ *,
+ media_type: str = "multipart/form-data",
+ alias: Optional[str] = None,
+ title: Optional[str] = None,
+ description: Optional[str] = None,
+ gt: Optional[float] = None,
+ ge: Optional[float] = None,
+ lt: Optional[float] = None,
+ le: Optional[float] = None,
+ min_length: Optional[int] = None,
+ max_length: Optional[int] = None,
+ regex: Optional[str] = None,
+ **extra: Any,
+ ):
+ super().__init__(
+ default,
+ media_type=media_type,
+ alias=alias,
+ title=title,
+ description=description,
+ gt=gt,
+ ge=ge,
+ lt=lt,
+ le=le,
+ min_length=min_length,
+ max_length=max_length,
+ regex=regex,
+ **extra,
+ )
+
+
+class Depends:
+ def __init__(
+ self, dependency: Optional[Callable[..., Any]] = None, *, use_cache: bool = True
+ ):
+ self.dependency = dependency
+ self.use_cache = use_cache
+
+ def __repr__(self) -> str:
+ attr = getattr(self.dependency, "__name__", type(self.dependency).__name__)
+ cache = "" if self.use_cache else ", use_cache=False"
+ return f"{self.__class__.__name__}({attr}{cache})"
+
+
+class Security(Depends):
+ def __init__(
+ self,
+ dependency: Optional[Callable[..., Any]] = None,
+ *,
+ scopes: Optional[Sequence[str]] = None,
+ use_cache: bool = True,
+ ):
+ super().__init__(dependency=dependency, use_cache=use_cache)
+ self.scopes = scopes or []
diff --git a/.venv/lib/python3.9/site-packages/fastapi/py.typed b/.venv/lib/python3.9/site-packages/fastapi/py.typed
new file mode 100644
index 0000000..e69de29
diff --git a/.venv/lib/python3.9/site-packages/fastapi/requests.py b/.venv/lib/python3.9/site-packages/fastapi/requests.py
new file mode 100644
index 0000000..d16552c
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/requests.py
@@ -0,0 +1,2 @@
+from starlette.requests import HTTPConnection as HTTPConnection # noqa: F401
+from starlette.requests import Request as Request # noqa: F401
diff --git a/.venv/lib/python3.9/site-packages/fastapi/responses.py b/.venv/lib/python3.9/site-packages/fastapi/responses.py
new file mode 100644
index 0000000..8d9d62d
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/responses.py
@@ -0,0 +1,23 @@
+from typing import Any
+
+from starlette.responses import FileResponse as FileResponse # noqa
+from starlette.responses import HTMLResponse as HTMLResponse # noqa
+from starlette.responses import JSONResponse as JSONResponse # noqa
+from starlette.responses import PlainTextResponse as PlainTextResponse # noqa
+from starlette.responses import RedirectResponse as RedirectResponse # noqa
+from starlette.responses import Response as Response # noqa
+from starlette.responses import StreamingResponse as StreamingResponse # noqa
+from starlette.responses import UJSONResponse as UJSONResponse # noqa
+
+try:
+ import orjson
+except ImportError: # pragma: nocover
+ orjson = None # type: ignore
+
+
+class ORJSONResponse(JSONResponse):
+ media_type = "application/json"
+
+ def render(self, content: Any) -> bytes:
+ assert orjson is not None, "orjson must be installed to use ORJSONResponse"
+ return orjson.dumps(content)
diff --git a/.venv/lib/python3.9/site-packages/fastapi/routing.py b/.venv/lib/python3.9/site-packages/fastapi/routing.py
new file mode 100644
index 0000000..ac5e19d
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/routing.py
@@ -0,0 +1,1101 @@
+import asyncio
+import enum
+import inspect
+import json
+from typing import (
+ Any,
+ Callable,
+ Coroutine,
+ Dict,
+ List,
+ Optional,
+ Sequence,
+ Set,
+ Type,
+ Union,
+)
+
+from fastapi import params
+from fastapi.datastructures import Default, DefaultPlaceholder
+from fastapi.dependencies.models import Dependant
+from fastapi.dependencies.utils import (
+ get_body_field,
+ get_dependant,
+ get_parameterless_sub_dependant,
+ solve_dependencies,
+)
+from fastapi.encoders import DictIntStrAny, SetIntStr, jsonable_encoder
+from fastapi.exceptions import RequestValidationError, WebSocketRequestValidationError
+from fastapi.openapi.constants import STATUS_CODES_WITH_NO_BODY
+from fastapi.types import DecoratedCallable
+from fastapi.utils import (
+ create_cloned_field,
+ create_response_field,
+ generate_operation_id_for_path,
+ get_value_or_default,
+)
+from pydantic import BaseModel
+from pydantic.error_wrappers import ErrorWrapper, ValidationError
+from pydantic.fields import ModelField
+from starlette import routing
+from starlette.concurrency import run_in_threadpool
+from starlette.exceptions import HTTPException
+from starlette.requests import Request
+from starlette.responses import JSONResponse, Response
+from starlette.routing import BaseRoute
+from starlette.routing import Mount as Mount # noqa
+from starlette.routing import (
+ compile_path,
+ get_name,
+ request_response,
+ websocket_session,
+)
+from starlette.status import WS_1008_POLICY_VIOLATION
+from starlette.types import ASGIApp
+from starlette.websockets import WebSocket
+
+
+def _prepare_response_content(
+ res: Any,
+ *,
+ exclude_unset: bool,
+ exclude_defaults: bool = False,
+ exclude_none: bool = False,
+) -> Any:
+ if isinstance(res, BaseModel):
+ return res.dict(
+ by_alias=True,
+ exclude_unset=exclude_unset,
+ exclude_defaults=exclude_defaults,
+ exclude_none=exclude_none,
+ )
+ elif isinstance(res, list):
+ return [
+ _prepare_response_content(
+ item,
+ exclude_unset=exclude_unset,
+ exclude_defaults=exclude_defaults,
+ exclude_none=exclude_none,
+ )
+ for item in res
+ ]
+ elif isinstance(res, dict):
+ return {
+ k: _prepare_response_content(
+ v,
+ exclude_unset=exclude_unset,
+ exclude_defaults=exclude_defaults,
+ exclude_none=exclude_none,
+ )
+ for k, v in res.items()
+ }
+ return res
+
+
+async def serialize_response(
+ *,
+ field: Optional[ModelField] = None,
+ response_content: Any,
+ include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ by_alias: bool = True,
+ exclude_unset: bool = False,
+ exclude_defaults: bool = False,
+ exclude_none: bool = False,
+ is_coroutine: bool = True,
+) -> Any:
+ if field:
+ errors = []
+ response_content = _prepare_response_content(
+ response_content,
+ exclude_unset=exclude_unset,
+ exclude_defaults=exclude_defaults,
+ exclude_none=exclude_none,
+ )
+ if is_coroutine:
+ value, errors_ = field.validate(response_content, {}, loc=("response",))
+ else:
+ value, errors_ = await run_in_threadpool(
+ field.validate, response_content, {}, loc=("response",)
+ )
+ if isinstance(errors_, ErrorWrapper):
+ errors.append(errors_)
+ elif isinstance(errors_, list):
+ errors.extend(errors_)
+ if errors:
+ raise ValidationError(errors, field.type_)
+ return jsonable_encoder(
+ value,
+ include=include,
+ exclude=exclude,
+ by_alias=by_alias,
+ exclude_unset=exclude_unset,
+ exclude_defaults=exclude_defaults,
+ exclude_none=exclude_none,
+ )
+ else:
+ return jsonable_encoder(response_content)
+
+
+async def run_endpoint_function(
+ *, dependant: Dependant, values: Dict[str, Any], is_coroutine: bool
+) -> Any:
+ # Only called by get_request_handler. Has been split into its own function to
+ # facilitate profiling endpoints, since inner functions are harder to profile.
+ assert dependant.call is not None, "dependant.call must be a function"
+
+ if is_coroutine:
+ return await dependant.call(**values)
+ else:
+ return await run_in_threadpool(dependant.call, **values)
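This sync/async dispatch is why plain `def` endpoints don't block the event loop: they are shipped to a thread pool while coroutine endpoints are awaited directly. A stdlib-only sketch of the same pattern, using `run_in_executor` in place of Starlette's `run_in_threadpool`:

```python
import asyncio
import inspect


async def run_endpoint(call, values):
    # Mirrors run_endpoint_function: coroutines are awaited directly,
    # plain functions run off the event loop in the default executor.
    if inspect.iscoroutinefunction(call):
        return await call(**values)
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, lambda: call(**values))


async def main():
    async def async_ep(x):
        return x + 1

    def sync_ep(x):
        return x * 2

    return await run_endpoint(async_ep, {"x": 1}), await run_endpoint(sync_ep, {"x": 3})


print(asyncio.run(main()))  # (2, 6)
```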
+
+
+def get_request_handler(
+ dependant: Dependant,
+ body_field: Optional[ModelField] = None,
+ status_code: int = 200,
+ response_class: Union[Type[Response], DefaultPlaceholder] = Default(JSONResponse),
+ response_field: Optional[ModelField] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ dependency_overrides_provider: Optional[Any] = None,
+) -> Callable[[Request], Coroutine[Any, Any, Response]]:
+ assert dependant.call is not None, "dependant.call must be a function"
+ is_coroutine = asyncio.iscoroutinefunction(dependant.call)
+ is_body_form = body_field and isinstance(body_field.field_info, params.Form)
+ if isinstance(response_class, DefaultPlaceholder):
+ actual_response_class: Type[Response] = response_class.value
+ else:
+ actual_response_class = response_class
+
+ async def app(request: Request) -> Response:
+ try:
+ body = None
+ if body_field:
+ if is_body_form:
+ body = await request.form()
+ else:
+ body_bytes = await request.body()
+ if body_bytes:
+ body = await request.json()
+ except json.JSONDecodeError as e:
+ raise RequestValidationError([ErrorWrapper(e, ("body", e.pos))], body=e.doc)
+ except Exception as e:
+ raise HTTPException(
+ status_code=400, detail="There was an error parsing the body"
+ ) from e
+ solved_result = await solve_dependencies(
+ request=request,
+ dependant=dependant,
+ body=body,
+ dependency_overrides_provider=dependency_overrides_provider,
+ )
+ values, errors, background_tasks, sub_response, _ = solved_result
+ if errors:
+ raise RequestValidationError(errors, body=body)
+ else:
+ raw_response = await run_endpoint_function(
+ dependant=dependant, values=values, is_coroutine=is_coroutine
+ )
+
+ if isinstance(raw_response, Response):
+ if raw_response.background is None:
+ raw_response.background = background_tasks
+ return raw_response
+ response_data = await serialize_response(
+ field=response_field,
+ response_content=raw_response,
+ include=response_model_include,
+ exclude=response_model_exclude,
+ by_alias=response_model_by_alias,
+ exclude_unset=response_model_exclude_unset,
+ exclude_defaults=response_model_exclude_defaults,
+ exclude_none=response_model_exclude_none,
+ is_coroutine=is_coroutine,
+ )
+ response = actual_response_class(
+ content=response_data,
+ status_code=status_code,
+ background=background_tasks, # type: ignore # in Starlette
+ )
+ response.headers.raw.extend(sub_response.headers.raw)
+ if sub_response.status_code:
+ response.status_code = sub_response.status_code
+ return response
+
+ return app
+
+
+def get_websocket_app(
+ dependant: Dependant, dependency_overrides_provider: Optional[Any] = None
+) -> Callable[[WebSocket], Coroutine[Any, Any, Any]]:
+ async def app(websocket: WebSocket) -> None:
+ solved_result = await solve_dependencies(
+ request=websocket,
+ dependant=dependant,
+ dependency_overrides_provider=dependency_overrides_provider,
+ )
+ values, errors, _, _2, _3 = solved_result
+ if errors:
+ await websocket.close(code=WS_1008_POLICY_VIOLATION)
+ raise WebSocketRequestValidationError(errors)
+ assert dependant.call is not None, "dependant.call must be a function"
+ await dependant.call(**values)
+
+ return app
+
+
+class APIWebSocketRoute(routing.WebSocketRoute):
+ def __init__(
+ self,
+ path: str,
+ endpoint: Callable[..., Any],
+ *,
+ name: Optional[str] = None,
+ dependency_overrides_provider: Optional[Any] = None,
+ ) -> None:
+ self.path = path
+ self.endpoint = endpoint
+ self.name = get_name(endpoint) if name is None else name
+ self.dependant = get_dependant(path=path, call=self.endpoint)
+ self.app = websocket_session(
+ get_websocket_app(
+ dependant=self.dependant,
+ dependency_overrides_provider=dependency_overrides_provider,
+ )
+ )
+ self.path_regex, self.path_format, self.param_convertors = compile_path(path)
+
+
+class APIRoute(routing.Route):
+ def __init__(
+ self,
+ path: str,
+ endpoint: Callable[..., Any],
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ name: Optional[str] = None,
+ methods: Optional[Union[Set[str], List[str]]] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Union[Type[Response], DefaultPlaceholder] = Default(
+ JSONResponse
+ ),
+ dependency_overrides_provider: Optional[Any] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> None:
+ # normalise enums e.g. http.HTTPStatus
+ if isinstance(status_code, enum.IntEnum):
+ status_code = int(status_code)
+ self.path = path
+ self.endpoint = endpoint
+ self.name = get_name(endpoint) if name is None else name
+ self.path_regex, self.path_format, self.param_convertors = compile_path(path)
+ if methods is None:
+ methods = ["GET"]
+ self.methods: Set[str] = {method.upper() for method in methods}
+ self.unique_id = generate_operation_id_for_path(
+ name=self.name, path=self.path_format, method=list(methods)[0]
+ )
+ self.response_model = response_model
+ if self.response_model:
+ assert (
+ status_code not in STATUS_CODES_WITH_NO_BODY
+ ), f"Status code {status_code} must not have a response body"
+ response_name = "Response_" + self.unique_id
+ self.response_field = create_response_field(
+ name=response_name, type_=self.response_model
+ )
+ # Create a clone of the field, so that a Pydantic submodel is not returned
+ # as is just because it's an instance of a subclass of a more limited class
+ # e.g. UserInDB (containing hashed_password) could be a subclass of User
+ # that doesn't have the hashed_password. But because it's a subclass, it
+ # would pass the validation and be returned as is.
+ # By being a new field, no inheritance will be passed as is. A new model
+ # will be always created.
+ self.secure_cloned_response_field: Optional[
+ ModelField
+ ] = create_cloned_field(self.response_field)
+ else:
+ self.response_field = None # type: ignore
+ self.secure_cloned_response_field = None
+ self.status_code = status_code
+ self.tags = tags or []
+ if dependencies:
+ self.dependencies = list(dependencies)
+ else:
+ self.dependencies = []
+ self.summary = summary
+ self.description = description or inspect.cleandoc(self.endpoint.__doc__ or "")
+ # if a "form feed" character (page break) is found in the description text,
+ # truncate description text to the content preceding the first "form feed"
+ self.description = self.description.split("\f")[0]
+ self.response_description = response_description
+ self.responses = responses or {}
+ response_fields = {}
+ for additional_status_code, response in self.responses.items():
+ assert isinstance(response, dict), "An additional response must be a dict"
+ model = response.get("model")
+ if model:
+ assert (
+ additional_status_code not in STATUS_CODES_WITH_NO_BODY
+ ), f"Status code {additional_status_code} must not have a response body"
+ response_name = f"Response_{additional_status_code}_{self.unique_id}"
+ response_field = create_response_field(name=response_name, type_=model)
+ response_fields[additional_status_code] = response_field
+ if response_fields:
+ self.response_fields: Dict[Union[int, str], ModelField] = response_fields
+ else:
+ self.response_fields = {}
+ self.deprecated = deprecated
+ self.operation_id = operation_id
+ self.response_model_include = response_model_include
+ self.response_model_exclude = response_model_exclude
+ self.response_model_by_alias = response_model_by_alias
+ self.response_model_exclude_unset = response_model_exclude_unset
+ self.response_model_exclude_defaults = response_model_exclude_defaults
+ self.response_model_exclude_none = response_model_exclude_none
+ self.include_in_schema = include_in_schema
+ self.response_class = response_class
+
+ assert callable(endpoint), "An endpoint must be a callable"
+ self.dependant = get_dependant(path=self.path_format, call=self.endpoint)
+ for depends in self.dependencies[::-1]:
+ self.dependant.dependencies.insert(
+ 0,
+ get_parameterless_sub_dependant(depends=depends, path=self.path_format),
+ )
+ self.body_field = get_body_field(dependant=self.dependant, name=self.unique_id)
+ self.dependency_overrides_provider = dependency_overrides_provider
+ self.callbacks = callbacks
+ self.app = request_response(self.get_route_handler())
+
+ def get_route_handler(self) -> Callable[[Request], Coroutine[Any, Any, Response]]:
+ return get_request_handler(
+ dependant=self.dependant,
+ body_field=self.body_field,
+ status_code=self.status_code,
+ response_class=self.response_class,
+ response_field=self.secure_cloned_response_field,
+ response_model_include=self.response_model_include,
+ response_model_exclude=self.response_model_exclude,
+ response_model_by_alias=self.response_model_by_alias,
+ response_model_exclude_unset=self.response_model_exclude_unset,
+ response_model_exclude_defaults=self.response_model_exclude_defaults,
+ response_model_exclude_none=self.response_model_exclude_none,
+ dependency_overrides_provider=self.dependency_overrides_provider,
+ )
+
+
+class APIRouter(routing.Router):
+ def __init__(
+ self,
+ *,
+ prefix: str = "",
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ default_response_class: Type[Response] = Default(JSONResponse),
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ routes: Optional[List[routing.BaseRoute]] = None,
+ redirect_slashes: bool = True,
+ default: Optional[ASGIApp] = None,
+ dependency_overrides_provider: Optional[Any] = None,
+ route_class: Type[APIRoute] = APIRoute,
+ on_startup: Optional[Sequence[Callable[[], Any]]] = None,
+ on_shutdown: Optional[Sequence[Callable[[], Any]]] = None,
+ deprecated: Optional[bool] = None,
+ include_in_schema: bool = True,
+ ) -> None:
+ super().__init__(
+ routes=routes, # type: ignore # in Starlette
+ redirect_slashes=redirect_slashes,
+ default=default, # type: ignore # in Starlette
+ on_startup=on_startup, # type: ignore # in Starlette
+ on_shutdown=on_shutdown, # type: ignore # in Starlette
+ )
+ if prefix:
+ assert prefix.startswith("/"), "A path prefix must start with '/'"
+ assert not prefix.endswith(
+ "/"
+ ), "A path prefix must not end with '/', as the routes will start with '/'"
+ self.prefix = prefix
+ self.tags: List[str] = tags or []
+ self.dependencies = list(dependencies or [])
+ self.deprecated = deprecated
+ self.include_in_schema = include_in_schema
+ self.responses = responses or {}
+ self.callbacks = callbacks or []
+ self.dependency_overrides_provider = dependency_overrides_provider
+ self.route_class = route_class
+ self.default_response_class = default_response_class
+
+ def add_api_route(
+ self,
+ path: str,
+ endpoint: Callable[..., Any],
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ methods: Optional[Union[Set[str], List[str]]] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Union[Type[Response], DefaultPlaceholder] = Default(
+ JSONResponse
+ ),
+ name: Optional[str] = None,
+ route_class_override: Optional[Type[APIRoute]] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> None:
+ route_class = route_class_override or self.route_class
+ responses = responses or {}
+ combined_responses = {**self.responses, **responses}
+ current_response_class = get_value_or_default(
+ response_class, self.default_response_class
+ )
+ current_tags = self.tags.copy()
+ if tags:
+ current_tags.extend(tags)
+ current_dependencies = self.dependencies.copy()
+ if dependencies:
+ current_dependencies.extend(dependencies)
+ current_callbacks = self.callbacks.copy()
+ if callbacks:
+ current_callbacks.extend(callbacks)
+ route = route_class(
+ self.prefix + path,
+ endpoint=endpoint,
+ response_model=response_model,
+ status_code=status_code,
+ tags=current_tags,
+ dependencies=current_dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=combined_responses,
+ deprecated=deprecated or self.deprecated,
+ methods=methods,
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema and self.include_in_schema,
+ response_class=current_response_class,
+ name=name,
+ dependency_overrides_provider=self.dependency_overrides_provider,
+ callbacks=current_callbacks,
+ )
+ self.routes.append(route)
+
+ def api_route(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ methods: Optional[List[str]] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ def decorator(func: DecoratedCallable) -> DecoratedCallable:
+ self.add_api_route(
+ path,
+ func,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ methods=methods,
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+ return func
+
+ return decorator
+
+ def add_api_websocket_route(
+ self, path: str, endpoint: Callable[..., Any], name: Optional[str] = None
+ ) -> None:
+ route = APIWebSocketRoute(
+ path,
+ endpoint=endpoint,
+ name=name,
+ dependency_overrides_provider=self.dependency_overrides_provider,
+ )
+ self.routes.append(route)
+
+ def websocket(
+ self, path: str, name: Optional[str] = None
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ def decorator(func: DecoratedCallable) -> DecoratedCallable:
+ self.add_api_websocket_route(path, func, name=name)
+ return func
+
+ return decorator
+
+ def include_router(
+ self,
+ router: "APIRouter",
+ *,
+ prefix: str = "",
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ default_response_class: Type[Response] = Default(JSONResponse),
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ deprecated: Optional[bool] = None,
+ include_in_schema: bool = True,
+ ) -> None:
+ if prefix:
+ assert prefix.startswith("/"), "A path prefix must start with '/'"
+ assert not prefix.endswith(
+ "/"
+ ), "A path prefix must not end with '/', as the routes will start with '/'"
+ else:
+ for r in router.routes:
+ path = getattr(r, "path")
+ name = getattr(r, "name", "unknown")
+ if path is not None and not path:
+ raise Exception(
+ f"Prefix and path cannot be both empty (path operation: {name})"
+ )
+ if responses is None:
+ responses = {}
+ for route in router.routes:
+ if isinstance(route, APIRoute):
+ combined_responses = {**responses, **route.responses}
+ use_response_class = get_value_or_default(
+ route.response_class,
+ router.default_response_class,
+ default_response_class,
+ self.default_response_class,
+ )
+ current_tags = []
+ if tags:
+ current_tags.extend(tags)
+ if route.tags:
+ current_tags.extend(route.tags)
+ current_dependencies: List[params.Depends] = []
+ if dependencies:
+ current_dependencies.extend(dependencies)
+ if route.dependencies:
+ current_dependencies.extend(route.dependencies)
+ current_callbacks = []
+ if callbacks:
+ current_callbacks.extend(callbacks)
+ if route.callbacks:
+ current_callbacks.extend(route.callbacks)
+ self.add_api_route(
+ prefix + route.path,
+ route.endpoint,
+ response_model=route.response_model,
+ status_code=route.status_code,
+ tags=current_tags,
+ dependencies=current_dependencies,
+ summary=route.summary,
+ description=route.description,
+ response_description=route.response_description,
+ responses=combined_responses,
+ deprecated=route.deprecated or deprecated or self.deprecated,
+ methods=route.methods,
+ operation_id=route.operation_id,
+ response_model_include=route.response_model_include,
+ response_model_exclude=route.response_model_exclude,
+ response_model_by_alias=route.response_model_by_alias,
+ response_model_exclude_unset=route.response_model_exclude_unset,
+ response_model_exclude_defaults=route.response_model_exclude_defaults,
+ response_model_exclude_none=route.response_model_exclude_none,
+ include_in_schema=route.include_in_schema
+ and self.include_in_schema
+ and include_in_schema,
+ response_class=use_response_class,
+ name=route.name,
+ route_class_override=type(route),
+ callbacks=current_callbacks,
+ )
+ elif isinstance(route, routing.Route):
+ methods = list(route.methods or []) # type: ignore # in Starlette
+ self.add_route(
+ prefix + route.path,
+ route.endpoint,
+ methods=methods,
+ include_in_schema=route.include_in_schema,
+ name=route.name,
+ )
+ elif isinstance(route, APIWebSocketRoute):
+ self.add_api_websocket_route(
+ prefix + route.path, route.endpoint, name=route.name
+ )
+ elif isinstance(route, routing.WebSocketRoute):
+ self.add_websocket_route(
+ prefix + route.path, route.endpoint, name=route.name
+ )
+ for handler in router.on_startup:
+ self.add_event_handler("startup", handler)
+ for handler in router.on_shutdown:
+ self.add_event_handler("shutdown", handler)
+
+ def get(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.api_route(
+ path=path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ methods=["GET"],
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def put(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.api_route(
+ path=path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ methods=["PUT"],
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def post(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.api_route(
+ path=path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ methods=["POST"],
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def delete(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.api_route(
+ path=path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ methods=["DELETE"],
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def options(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.api_route(
+ path=path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ methods=["OPTIONS"],
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def head(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.api_route(
+ path=path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ methods=["HEAD"],
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def patch(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.api_route(
+ path=path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ methods=["PATCH"],
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
+
+ def trace(
+ self,
+ path: str,
+ *,
+ response_model: Optional[Type[Any]] = None,
+ status_code: int = 200,
+ tags: Optional[List[str]] = None,
+ dependencies: Optional[Sequence[params.Depends]] = None,
+ summary: Optional[str] = None,
+ description: Optional[str] = None,
+ response_description: str = "Successful Response",
+ responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
+ deprecated: Optional[bool] = None,
+ operation_id: Optional[str] = None,
+ response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
+ response_model_by_alias: bool = True,
+ response_model_exclude_unset: bool = False,
+ response_model_exclude_defaults: bool = False,
+ response_model_exclude_none: bool = False,
+ include_in_schema: bool = True,
+ response_class: Type[Response] = Default(JSONResponse),
+ name: Optional[str] = None,
+ callbacks: Optional[List[BaseRoute]] = None,
+ ) -> Callable[[DecoratedCallable], DecoratedCallable]:
+ return self.api_route(
+ path=path,
+ response_model=response_model,
+ status_code=status_code,
+ tags=tags,
+ dependencies=dependencies,
+ summary=summary,
+ description=description,
+ response_description=response_description,
+ responses=responses,
+ deprecated=deprecated,
+ methods=["TRACE"],
+ operation_id=operation_id,
+ response_model_include=response_model_include,
+ response_model_exclude=response_model_exclude,
+ response_model_by_alias=response_model_by_alias,
+ response_model_exclude_unset=response_model_exclude_unset,
+ response_model_exclude_defaults=response_model_exclude_defaults,
+ response_model_exclude_none=response_model_exclude_none,
+ include_in_schema=include_in_schema,
+ response_class=response_class,
+ name=name,
+ callbacks=callbacks,
+ )
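The `include_router` method above merges router-level and route-level metadata (tags, dependencies, responses, callbacks) before re-registering each route on the parent, with route-level `responses` entries overriding router-level ones on key collisions. A minimal stdlib sketch of that merging rule (the `merge_route_metadata` helper is hypothetical, not a FastAPI name):

```python
from typing import Any, Dict, List, Optional


def merge_route_metadata(
    router_tags: Optional[List[str]],
    route_tags: Optional[List[str]],
    router_responses: Optional[Dict[Any, Any]],
    route_responses: Dict[Any, Any],
) -> Dict[str, Any]:
    """Mirror how include_router combines tags and responses.

    Router-level values come first; route-level values extend them
    (tags) or override them on key collisions (responses), exactly as
    in the loop over router.routes above.
    """
    tags: List[str] = []
    if router_tags:
        tags.extend(router_tags)
    if route_tags:
        tags.extend(route_tags)
    # Route-level responses win, as in {**responses, **route.responses}.
    responses = {**(router_responses or {}), **route_responses}
    return {"tags": tags, "responses": responses}


merged = merge_route_metadata(
    router_tags=["items"],
    route_tags=["admin"],
    router_responses={404: {"description": "Not found"}},
    route_responses={404: {"description": "Item not found"}},
)
```

The same first-parent-then-child ordering is applied to `dependencies` and `callbacks` in the real method.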
diff --git a/.venv/lib/python3.9/site-packages/fastapi/security/__init__.py b/.venv/lib/python3.9/site-packages/fastapi/security/__init__.py
new file mode 100644
index 0000000..3aa6bf2
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/security/__init__.py
@@ -0,0 +1,15 @@
+from .api_key import APIKeyCookie as APIKeyCookie
+from .api_key import APIKeyHeader as APIKeyHeader
+from .api_key import APIKeyQuery as APIKeyQuery
+from .http import HTTPAuthorizationCredentials as HTTPAuthorizationCredentials
+from .http import HTTPBasic as HTTPBasic
+from .http import HTTPBasicCredentials as HTTPBasicCredentials
+from .http import HTTPBearer as HTTPBearer
+from .http import HTTPDigest as HTTPDigest
+from .oauth2 import OAuth2 as OAuth2
+from .oauth2 import OAuth2AuthorizationCodeBearer as OAuth2AuthorizationCodeBearer
+from .oauth2 import OAuth2PasswordBearer as OAuth2PasswordBearer
+from .oauth2 import OAuth2PasswordRequestForm as OAuth2PasswordRequestForm
+from .oauth2 import OAuth2PasswordRequestFormStrict as OAuth2PasswordRequestFormStrict
+from .oauth2 import SecurityScopes as SecurityScopes
+from .open_id_connect_url import OpenIdConnect as OpenIdConnect
diff --git a/.venv/lib/python3.9/site-packages/fastapi/security/api_key.py b/.venv/lib/python3.9/site-packages/fastapi/security/api_key.py
new file mode 100644
index 0000000..e4dacb3
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/security/api_key.py
@@ -0,0 +1,71 @@
+from typing import Optional
+
+from fastapi.openapi.models import APIKey, APIKeyIn
+from fastapi.security.base import SecurityBase
+from starlette.exceptions import HTTPException
+from starlette.requests import Request
+from starlette.status import HTTP_403_FORBIDDEN
+
+
+class APIKeyBase(SecurityBase):
+ pass
+
+
+class APIKeyQuery(APIKeyBase):
+ def __init__(
+ self, *, name: str, scheme_name: Optional[str] = None, auto_error: bool = True
+ ):
+ self.model: APIKey = APIKey(**{"in": APIKeyIn.query}, name=name)
+ self.scheme_name = scheme_name or self.__class__.__name__
+ self.auto_error = auto_error
+
+ async def __call__(self, request: Request) -> Optional[str]:
+ api_key: str = request.query_params.get(self.model.name)
+ if not api_key:
+ if self.auto_error:
+ raise HTTPException(
+ status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
+ )
+ else:
+ return None
+ return api_key
+
+
+class APIKeyHeader(APIKeyBase):
+ def __init__(
+ self, *, name: str, scheme_name: Optional[str] = None, auto_error: bool = True
+ ):
+ self.model: APIKey = APIKey(**{"in": APIKeyIn.header}, name=name)
+ self.scheme_name = scheme_name or self.__class__.__name__
+ self.auto_error = auto_error
+
+ async def __call__(self, request: Request) -> Optional[str]:
+ api_key: str = request.headers.get(self.model.name)
+ if not api_key:
+ if self.auto_error:
+ raise HTTPException(
+ status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
+ )
+ else:
+ return None
+ return api_key
+
+
+class APIKeyCookie(APIKeyBase):
+ def __init__(
+ self, *, name: str, scheme_name: Optional[str] = None, auto_error: bool = True
+ ):
+ self.model: APIKey = APIKey(**{"in": APIKeyIn.cookie}, name=name)
+ self.scheme_name = scheme_name or self.__class__.__name__
+ self.auto_error = auto_error
+
+ async def __call__(self, request: Request) -> Optional[str]:
+ api_key = request.cookies.get(self.model.name)
+ if not api_key:
+ if self.auto_error:
+ raise HTTPException(
+ status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
+ )
+ else:
+ return None
+ return api_key
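All three API-key classes above follow the same pattern: read the key from one request mapping (query params, headers, or cookies); if it is missing, raise a 403 when `auto_error` is set, otherwise return `None`. A stdlib-only sketch of that shared logic (`extract_api_key` and `AuthRequired` are hypothetical stand-ins, not FastAPI names):

```python
from typing import Mapping, Optional


class AuthRequired(Exception):
    """Stands in for HTTPException(403, "Not authenticated") in this sketch."""


def extract_api_key(
    source: Mapping[str, str], name: str, auto_error: bool = True
) -> Optional[str]:
    """Mirror APIKeyQuery/APIKeyHeader/APIKeyCookie.__call__: look the
    configured key name up in a request mapping and either raise or
    return None when it is absent."""
    api_key = source.get(name)
    if not api_key:
        if auto_error:
            raise AuthRequired("Not authenticated")
        return None
    return api_key
```

With `auto_error=False` a missing key yields `None`, which lets a path operation treat the credential as optional instead of failing the request.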
diff --git a/.venv/lib/python3.9/site-packages/fastapi/security/base.py b/.venv/lib/python3.9/site-packages/fastapi/security/base.py
new file mode 100644
index 0000000..c43555d
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/security/base.py
@@ -0,0 +1,6 @@
+from fastapi.openapi.models import SecurityBase as SecurityBaseModel
+
+
+class SecurityBase:
+ model: SecurityBaseModel
+ scheme_name: str
diff --git a/.venv/lib/python3.9/site-packages/fastapi/security/http.py b/.venv/lib/python3.9/site-packages/fastapi/security/http.py
new file mode 100644
index 0000000..3258bd0
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/security/http.py
@@ -0,0 +1,152 @@
+import binascii
+from base64 import b64decode
+from typing import Optional
+
+from fastapi.exceptions import HTTPException
+from fastapi.openapi.models import HTTPBase as HTTPBaseModel
+from fastapi.openapi.models import HTTPBearer as HTTPBearerModel
+from fastapi.security.base import SecurityBase
+from fastapi.security.utils import get_authorization_scheme_param
+from pydantic import BaseModel
+from starlette.requests import Request
+from starlette.status import HTTP_401_UNAUTHORIZED, HTTP_403_FORBIDDEN
+
+
+class HTTPBasicCredentials(BaseModel):
+ username: str
+ password: str
+
+
+class HTTPAuthorizationCredentials(BaseModel):
+ scheme: str
+ credentials: str
+
+
+class HTTPBase(SecurityBase):
+ def __init__(
+ self, *, scheme: str, scheme_name: Optional[str] = None, auto_error: bool = True
+ ):
+ self.model = HTTPBaseModel(scheme=scheme)
+ self.scheme_name = scheme_name or self.__class__.__name__
+ self.auto_error = auto_error
+
+ async def __call__(
+ self, request: Request
+ ) -> Optional[HTTPAuthorizationCredentials]:
+ authorization: str = request.headers.get("Authorization")
+ scheme, credentials = get_authorization_scheme_param(authorization)
+ if not (authorization and scheme and credentials):
+ if self.auto_error:
+ raise HTTPException(
+ status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
+ )
+ else:
+ return None
+ return HTTPAuthorizationCredentials(scheme=scheme, credentials=credentials)
+
+
+class HTTPBasic(HTTPBase):
+ def __init__(
+ self,
+ *,
+ scheme_name: Optional[str] = None,
+ realm: Optional[str] = None,
+ auto_error: bool = True,
+ ):
+ self.model = HTTPBaseModel(scheme="basic")
+ self.scheme_name = scheme_name or self.__class__.__name__
+ self.realm = realm
+ self.auto_error = auto_error
+
+ async def __call__( # type: ignore
+ self, request: Request
+ ) -> Optional[HTTPBasicCredentials]:
+ authorization: str = request.headers.get("Authorization")
+ scheme, param = get_authorization_scheme_param(authorization)
+ if self.realm:
+ unauthorized_headers = {"WWW-Authenticate": f'Basic realm="{self.realm}"'}
+ else:
+ unauthorized_headers = {"WWW-Authenticate": "Basic"}
+ invalid_user_credentials_exc = HTTPException(
+ status_code=HTTP_401_UNAUTHORIZED,
+ detail="Invalid authentication credentials",
+ headers=unauthorized_headers,
+ )
+ if not authorization or scheme.lower() != "basic":
+ if self.auto_error:
+ raise HTTPException(
+ status_code=HTTP_401_UNAUTHORIZED,
+ detail="Not authenticated",
+ headers=unauthorized_headers,
+ )
+ else:
+ return None
+ try:
+ data = b64decode(param).decode("ascii")
+ except (ValueError, UnicodeDecodeError, binascii.Error):
+ raise invalid_user_credentials_exc
+ username, separator, password = data.partition(":")
+ if not separator:
+ raise invalid_user_credentials_exc
+ return HTTPBasicCredentials(username=username, password=password)
+
+
+class HTTPBearer(HTTPBase):
+ def __init__(
+ self,
+ *,
+ bearerFormat: Optional[str] = None,
+ scheme_name: Optional[str] = None,
+ auto_error: bool = True,
+ ):
+ self.model = HTTPBearerModel(bearerFormat=bearerFormat)
+ self.scheme_name = scheme_name or self.__class__.__name__
+ self.auto_error = auto_error
+
+ async def __call__(
+ self, request: Request
+ ) -> Optional[HTTPAuthorizationCredentials]:
+ authorization: str = request.headers.get("Authorization")
+ scheme, credentials = get_authorization_scheme_param(authorization)
+ if not (authorization and scheme and credentials):
+ if self.auto_error:
+ raise HTTPException(
+ status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
+ )
+ else:
+ return None
+ if scheme.lower() != "bearer":
+ if self.auto_error:
+ raise HTTPException(
+ status_code=HTTP_403_FORBIDDEN,
+ detail="Invalid authentication credentials",
+ )
+ else:
+ return None
+ return HTTPAuthorizationCredentials(scheme=scheme, credentials=credentials)
+
+
+class HTTPDigest(HTTPBase):
+ def __init__(self, *, scheme_name: Optional[str] = None, auto_error: bool = True):
+ self.model = HTTPBaseModel(scheme="digest")
+ self.scheme_name = scheme_name or self.__class__.__name__
+ self.auto_error = auto_error
+
+ async def __call__(
+ self, request: Request
+ ) -> Optional[HTTPAuthorizationCredentials]:
+ authorization: str = request.headers.get("Authorization")
+ scheme, credentials = get_authorization_scheme_param(authorization)
+ if not (authorization and scheme and credentials):
+ if self.auto_error:
+ raise HTTPException(
+ status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
+ )
+ else:
+ return None
+ if scheme.lower() != "digest":
+ raise HTTPException(
+ status_code=HTTP_403_FORBIDDEN,
+ detail="Invalid authentication credentials",
+ )
+ return HTTPAuthorizationCredentials(scheme=scheme, credentials=credentials)
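`HTTPBasic` above splits the `Authorization` header into scheme and parameter, base64-decodes the parameter, and partitions it on the first `:` into username and password. A self-contained sketch of that decoding path, returning `None` for malformed input instead of raising (the `parse_basic_auth` name is hypothetical):

```python
import binascii
from base64 import b64decode
from typing import Optional, Tuple


def parse_basic_auth(authorization: Optional[str]) -> Optional[Tuple[str, str]]:
    """Mirror HTTPBasic.__call__'s credential decoding: scheme check,
    base64 decode, then split on the first ':'.  Unlike the class
    above, malformed input yields None rather than an HTTPException."""
    if not authorization:
        return None
    scheme, _, param = authorization.partition(" ")
    if scheme.lower() != "basic" or not param:
        return None
    try:
        data = b64decode(param).decode("ascii")
    except (ValueError, UnicodeDecodeError, binascii.Error):
        return None
    # partition keeps everything after the first ':' as the password,
    # so passwords containing ':' survive intact.
    username, separator, password = data.partition(":")
    if not separator:
        return None
    return username, password
```

Using `partition(":")` rather than `split(":")` is the important detail: only the first colon delimits the username.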
diff --git a/.venv/lib/python3.9/site-packages/fastapi/security/oauth2.py b/.venv/lib/python3.9/site-packages/fastapi/security/oauth2.py
new file mode 100644
index 0000000..46571ad
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/security/oauth2.py
@@ -0,0 +1,207 @@
+from typing import Any, Dict, List, Optional, Union
+
+from fastapi.exceptions import HTTPException
+from fastapi.openapi.models import OAuth2 as OAuth2Model
+from fastapi.openapi.models import OAuthFlows as OAuthFlowsModel
+from fastapi.param_functions import Form
+from fastapi.security.base import SecurityBase
+from fastapi.security.utils import get_authorization_scheme_param
+from starlette.requests import Request
+from starlette.status import HTTP_401_UNAUTHORIZED, HTTP_403_FORBIDDEN
+
+
+class OAuth2PasswordRequestForm:
+ """
+    This is a dependency class; use it like:
+
+ @app.post("/login")
+ def login(form_data: OAuth2PasswordRequestForm = Depends()):
+        print(form_data.username)
+        print(form_data.password)
+        for scope in form_data.scopes:
+            print(scope)
+        if form_data.client_id:
+            print(form_data.client_id)
+        if form_data.client_secret:
+            print(form_data.client_secret)
+        return form_data
+
+ It creates the following Form request parameters in your endpoint:
+
+ grant_type: the OAuth2 spec says it is required and MUST be the fixed string "password".
+ Nevertheless, this dependency class is permissive and allows not passing it. If you want to enforce it,
+ use instead the OAuth2PasswordRequestFormStrict dependency.
+ username: username string. The OAuth2 spec requires the exact field name "username".
+ password: password string. The OAuth2 spec requires the exact field name "password".
+ scope: Optional string. Several scopes (each one a string) separated by spaces. E.g.
+ "items:read items:write users:read profile openid"
+ client_id: optional string. OAuth2 recommends sending the client_id and client_secret (if any)
+ using HTTP Basic auth, as: client_id:client_secret
+ client_secret: optional string. OAuth2 recommends sending the client_id and client_secret (if any)
+ using HTTP Basic auth, as: client_id:client_secret
+ """
+
+ def __init__(
+ self,
+ grant_type: str = Form(None, regex="password"),
+ username: str = Form(...),
+ password: str = Form(...),
+ scope: str = Form(""),
+ client_id: Optional[str] = Form(None),
+ client_secret: Optional[str] = Form(None),
+ ):
+ self.grant_type = grant_type
+ self.username = username
+ self.password = password
+ self.scopes = scope.split()
+ self.client_id = client_id
+ self.client_secret = client_secret
+
+
+class OAuth2PasswordRequestFormStrict(OAuth2PasswordRequestForm):
+ """
+ This is a dependency class, use it like:
+
+ @app.post("/login")
+ def login(form_data: OAuth2PasswordRequestFormStrict = Depends()):
+ print(form_data.username)
+ print(form_data.password)
+ for scope in form_data.scopes:
+ print(scope)
+ if form_data.client_id:
+ print(form_data.client_id)
+ if form_data.client_secret:
+ print(form_data.client_secret)
+ return form_data
+
+
+ It creates the following Form request parameters in your endpoint:
+
+ grant_type: the OAuth2 spec says it is required and MUST be the fixed string "password".
+ This dependency is strict about it. If you want to be permissive, use instead the
+ OAuth2PasswordRequestForm dependency class.
+ username: username string. The OAuth2 spec requires the exact field name "username".
+ password: password string. The OAuth2 spec requires the exact field name "password".
+ scope: Optional string. Several scopes (each one a string) separated by spaces. E.g.
+ "items:read items:write users:read profile openid"
+ client_id: optional string. OAuth2 recommends sending the client_id and client_secret (if any)
+ using HTTP Basic auth, as: client_id:client_secret
+ client_secret: optional string. OAuth2 recommends sending the client_id and client_secret (if any)
+ using HTTP Basic auth, as: client_id:client_secret
+ """
+
+ def __init__(
+ self,
+ grant_type: str = Form(..., regex="password"),
+ username: str = Form(...),
+ password: str = Form(...),
+ scope: str = Form(""),
+ client_id: Optional[str] = Form(None),
+ client_secret: Optional[str] = Form(None),
+ ):
+ super().__init__(
+ grant_type=grant_type,
+ username=username,
+ password=password,
+ scope=scope,
+ client_id=client_id,
+ client_secret=client_secret,
+ )
+
+
+class OAuth2(SecurityBase):
+ def __init__(
+ self,
+ *,
+ flows: Union[OAuthFlowsModel, Dict[str, Dict[str, Any]]] = OAuthFlowsModel(),
+ scheme_name: Optional[str] = None,
+ auto_error: bool = True
+ ):
+ self.model = OAuth2Model(flows=flows)
+ self.scheme_name = scheme_name or self.__class__.__name__
+ self.auto_error = auto_error
+
+ async def __call__(self, request: Request) -> Optional[str]:
+ authorization: str = request.headers.get("Authorization")
+ if not authorization:
+ if self.auto_error:
+ raise HTTPException(
+ status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
+ )
+ else:
+ return None
+ return authorization
+
+
+class OAuth2PasswordBearer(OAuth2):
+ def __init__(
+ self,
+ tokenUrl: str,
+ scheme_name: Optional[str] = None,
+ scopes: Optional[Dict[str, str]] = None,
+ auto_error: bool = True,
+ ):
+ if not scopes:
+ scopes = {}
+ flows = OAuthFlowsModel(password={"tokenUrl": tokenUrl, "scopes": scopes})
+ super().__init__(flows=flows, scheme_name=scheme_name, auto_error=auto_error)
+
+ async def __call__(self, request: Request) -> Optional[str]:
+ authorization: str = request.headers.get("Authorization")
+ scheme, param = get_authorization_scheme_param(authorization)
+ if not authorization or scheme.lower() != "bearer":
+ if self.auto_error:
+ raise HTTPException(
+ status_code=HTTP_401_UNAUTHORIZED,
+ detail="Not authenticated",
+ headers={"WWW-Authenticate": "Bearer"},
+ )
+ else:
+ return None
+ return param
+
+
+class OAuth2AuthorizationCodeBearer(OAuth2):
+ def __init__(
+ self,
+ authorizationUrl: str,
+ tokenUrl: str,
+ refreshUrl: Optional[str] = None,
+ scheme_name: Optional[str] = None,
+ scopes: Optional[Dict[str, str]] = None,
+ auto_error: bool = True,
+ ):
+ if not scopes:
+ scopes = {}
+ flows = OAuthFlowsModel(
+ authorizationCode={
+ "authorizationUrl": authorizationUrl,
+ "tokenUrl": tokenUrl,
+ "refreshUrl": refreshUrl,
+ "scopes": scopes,
+ }
+ )
+ super().__init__(flows=flows, scheme_name=scheme_name, auto_error=auto_error)
+
+ async def __call__(self, request: Request) -> Optional[str]:
+ authorization: str = request.headers.get("Authorization")
+ scheme, param = get_authorization_scheme_param(authorization)
+ if not authorization or scheme.lower() != "bearer":
+ if self.auto_error:
+ raise HTTPException(
+ status_code=HTTP_401_UNAUTHORIZED,
+ detail="Not authenticated",
+ headers={"WWW-Authenticate": "Bearer"},
+ )
+ else:
+ return None # pragma: nocover
+ return param
+
+
+class SecurityScopes:
+ def __init__(self, scopes: Optional[List[str]] = None):
+ self.scopes = scopes or []
+ self.scope_str = " ".join(self.scopes)
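The security classes above all follow the same pattern: read the `Authorization` header, split off the scheme with `get_authorization_scheme_param`, and either return the credential or fail. A minimal standalone sketch of the Bearer extraction performed by `OAuth2PasswordBearer.__call__` (it mirrors, but does not import, FastAPI; the header values are made up):

```python
# Standalone sketch of Bearer-token extraction; returns None where
# FastAPI with auto_error=True would raise HTTP 401 instead.
from typing import Optional, Tuple


def get_authorization_scheme_param(value: Optional[str]) -> Tuple[str, str]:
    # Split "Bearer abc123" into ("Bearer", "abc123").
    if not value:
        return "", ""
    scheme, _, param = value.partition(" ")
    return scheme, param


def extract_bearer_token(authorization: Optional[str]) -> Optional[str]:
    scheme, param = get_authorization_scheme_param(authorization)
    if not authorization or scheme.lower() != "bearer":
        # With auto_error=True, FastAPI raises HTTP_401_UNAUTHORIZED here.
        return None
    return param


print(extract_bearer_token("Bearer abc123"))       # abc123
print(extract_bearer_token("Basic dXNlcjpwYXNz"))  # None (wrong scheme)
print(extract_bearer_token(None))                  # None (header missing)
```

Note that the scheme comparison is case-insensitive (`scheme.lower()`), matching the HTTP auth-scheme rules, while the token itself is returned verbatim.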
diff --git a/.venv/lib/python3.9/site-packages/fastapi/security/open_id_connect_url.py b/.venv/lib/python3.9/site-packages/fastapi/security/open_id_connect_url.py
new file mode 100644
index 0000000..a98c13f
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/security/open_id_connect_url.py
@@ -0,0 +1,31 @@
+from typing import Optional
+
+from fastapi.openapi.models import OpenIdConnect as OpenIdConnectModel
+from fastapi.security.base import SecurityBase
+from starlette.exceptions import HTTPException
+from starlette.requests import Request
+from starlette.status import HTTP_403_FORBIDDEN
+
+
+class OpenIdConnect(SecurityBase):
+ def __init__(
+ self,
+ *,
+ openIdConnectUrl: str,
+ scheme_name: Optional[str] = None,
+ auto_error: bool = True
+ ):
+ self.model = OpenIdConnectModel(openIdConnectUrl=openIdConnectUrl)
+ self.scheme_name = scheme_name or self.__class__.__name__
+ self.auto_error = auto_error
+
+ async def __call__(self, request: Request) -> Optional[str]:
+ authorization: str = request.headers.get("Authorization")
+ if not authorization:
+ if self.auto_error:
+ raise HTTPException(
+ status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
+ )
+ else:
+ return None
+ return authorization
diff --git a/.venv/lib/python3.9/site-packages/fastapi/security/utils.py b/.venv/lib/python3.9/site-packages/fastapi/security/utils.py
new file mode 100644
index 0000000..2da0dd2
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/security/utils.py
@@ -0,0 +1,8 @@
+from typing import Tuple
+
+
+def get_authorization_scheme_param(authorization_header_value: str) -> Tuple[str, str]:
+ if not authorization_header_value:
+ return "", ""
+ scheme, _, param = authorization_header_value.partition(" ")
+ return scheme, param
diff --git a/.venv/lib/python3.9/site-packages/fastapi/staticfiles.py b/.venv/lib/python3.9/site-packages/fastapi/staticfiles.py
new file mode 100644
index 0000000..299015d
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/staticfiles.py
@@ -0,0 +1 @@
+from starlette.staticfiles import StaticFiles as StaticFiles # noqa
diff --git a/.venv/lib/python3.9/site-packages/fastapi/templating.py b/.venv/lib/python3.9/site-packages/fastapi/templating.py
new file mode 100644
index 0000000..0cb8684
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/templating.py
@@ -0,0 +1 @@
+from starlette.templating import Jinja2Templates as Jinja2Templates # noqa
diff --git a/.venv/lib/python3.9/site-packages/fastapi/testclient.py b/.venv/lib/python3.9/site-packages/fastapi/testclient.py
new file mode 100644
index 0000000..4012406
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/testclient.py
@@ -0,0 +1 @@
+from starlette.testclient import TestClient as TestClient # noqa
diff --git a/.venv/lib/python3.9/site-packages/fastapi/types.py b/.venv/lib/python3.9/site-packages/fastapi/types.py
new file mode 100644
index 0000000..e0bca46
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/types.py
@@ -0,0 +1,3 @@
+from typing import Any, Callable, TypeVar
+
+DecoratedCallable = TypeVar("DecoratedCallable", bound=Callable[..., Any])
diff --git a/.venv/lib/python3.9/site-packages/fastapi/utils.py b/.venv/lib/python3.9/site-packages/fastapi/utils.py
new file mode 100644
index 0000000..8913d85
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/utils.py
@@ -0,0 +1,156 @@
+import functools
+import re
+from dataclasses import is_dataclass
+from enum import Enum
+from typing import Any, Dict, Optional, Set, Type, Union, cast
+
+import fastapi
+from fastapi.datastructures import DefaultPlaceholder, DefaultType
+from fastapi.openapi.constants import REF_PREFIX
+from pydantic import BaseConfig, BaseModel, create_model
+from pydantic.class_validators import Validator
+from pydantic.fields import FieldInfo, ModelField, UndefinedType
+from pydantic.schema import model_process_schema
+from pydantic.utils import lenient_issubclass
+
+
+def get_model_definitions(
+ *,
+ flat_models: Set[Union[Type[BaseModel], Type[Enum]]],
+ model_name_map: Dict[Union[Type[BaseModel], Type[Enum]], str],
+) -> Dict[str, Any]:
+ definitions: Dict[str, Dict[str, Any]] = {}
+ for model in flat_models:
+ m_schema, m_definitions, m_nested_models = model_process_schema(
+ model, model_name_map=model_name_map, ref_prefix=REF_PREFIX
+ )
+ definitions.update(m_definitions)
+ model_name = model_name_map[model]
+ definitions[model_name] = m_schema
+ return definitions
+
+
+def get_path_param_names(path: str) -> Set[str]:
+ return set(re.findall("{(.*?)}", path))
+
+
+def create_response_field(
+ name: str,
+ type_: Type[Any],
+ class_validators: Optional[Dict[str, Validator]] = None,
+ default: Optional[Any] = None,
+ required: Union[bool, UndefinedType] = False,
+ model_config: Type[BaseConfig] = BaseConfig,
+ field_info: Optional[FieldInfo] = None,
+ alias: Optional[str] = None,
+) -> ModelField:
+ """
+ Create a new response field. Raises if type_ is invalid.
+ """
+ class_validators = class_validators or {}
+ field_info = field_info or FieldInfo(None)
+
+ response_field = functools.partial(
+ ModelField,
+ name=name,
+ type_=type_,
+ class_validators=class_validators,
+ default=default,
+ required=required,
+ model_config=model_config,
+ alias=alias,
+ )
+
+ try:
+ return response_field(field_info=field_info)
+ except RuntimeError:
+ raise fastapi.exceptions.FastAPIError(
+ f"Invalid args for response field! Hint: check that {type_} is a valid pydantic field type"
+ )
+
+
+def create_cloned_field(
+ field: ModelField,
+ *,
+ cloned_types: Optional[Dict[Type[BaseModel], Type[BaseModel]]] = None,
+) -> ModelField:
+ # _cloned_types has already cloned types, to support recursive models
+ if cloned_types is None:
+ cloned_types = dict()
+ original_type = field.type_
+ if is_dataclass(original_type) and hasattr(original_type, "__pydantic_model__"):
+ original_type = original_type.__pydantic_model__
+ use_type = original_type
+ if lenient_issubclass(original_type, BaseModel):
+ original_type = cast(Type[BaseModel], original_type)
+ use_type = cloned_types.get(original_type)
+ if use_type is None:
+ use_type = create_model(original_type.__name__, __base__=original_type)
+ cloned_types[original_type] = use_type
+ for f in original_type.__fields__.values():
+ use_type.__fields__[f.name] = create_cloned_field(
+ f, cloned_types=cloned_types
+ )
+ new_field = create_response_field(name=field.name, type_=use_type)
+ new_field.has_alias = field.has_alias
+ new_field.alias = field.alias
+ new_field.class_validators = field.class_validators
+ new_field.default = field.default
+ new_field.required = field.required
+ new_field.model_config = field.model_config
+ new_field.field_info = field.field_info
+ new_field.allow_none = field.allow_none
+ new_field.validate_always = field.validate_always
+ if field.sub_fields:
+ new_field.sub_fields = [
+ create_cloned_field(sub_field, cloned_types=cloned_types)
+ for sub_field in field.sub_fields
+ ]
+ if field.key_field:
+ new_field.key_field = create_cloned_field(
+ field.key_field, cloned_types=cloned_types
+ )
+ new_field.validators = field.validators
+ new_field.pre_validators = field.pre_validators
+ new_field.post_validators = field.post_validators
+ new_field.parse_json = field.parse_json
+ new_field.shape = field.shape
+ new_field.populate_validators()
+ return new_field
+
+
+def generate_operation_id_for_path(*, name: str, path: str, method: str) -> str:
+ operation_id = name + path
+ operation_id = re.sub("[^0-9a-zA-Z_]", "_", operation_id)
+ operation_id = operation_id + "_" + method.lower()
+ return operation_id
+
+
+def deep_dict_update(main_dict: Dict[Any, Any], update_dict: Dict[Any, Any]) -> None:
+ for key in update_dict:
+ if (
+ key in main_dict
+ and isinstance(main_dict[key], dict)
+ and isinstance(update_dict[key], dict)
+ ):
+ deep_dict_update(main_dict[key], update_dict[key])
+ else:
+ main_dict[key] = update_dict[key]
+
+
+def get_value_or_default(
+ first_item: Union[DefaultPlaceholder, DefaultType],
+ *extra_items: Union[DefaultPlaceholder, DefaultType],
+) -> Union[DefaultPlaceholder, DefaultType]:
+ """
+ Pass items or `DefaultPlaceholder`s by descending priority.
+
+ The first one to _not_ be a `DefaultPlaceholder` will be returned.
+
+ Otherwise, the first item (a `DefaultPlaceholder`) will be returned.
+ """
+ items = (first_item,) + extra_items
+ for item in items:
+ if not isinstance(item, DefaultPlaceholder):
+ return item
+ return first_item
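`deep_dict_update` above is what lets FastAPI layer `openapi_extra`-style overrides onto a generated schema: nested dicts are merged key by key, while any non-dict value is replaced wholesale. A standalone copy with an illustrative (made-up) OpenAPI-shaped example:

```python
from typing import Any, Dict


def deep_dict_update(main_dict: Dict[Any, Any], update_dict: Dict[Any, Any]) -> None:
    # Recursively merge update_dict into main_dict in place.
    for key in update_dict:
        if (
            key in main_dict
            and isinstance(main_dict[key], dict)
            and isinstance(update_dict[key], dict)
        ):
            deep_dict_update(main_dict[key], update_dict[key])
        else:
            main_dict[key] = update_dict[key]


base = {"info": {"title": "API", "version": "1.0"}, "paths": {}}
deep_dict_update(base, {"info": {"version": "2.0"}, "servers": [{"url": "/"}]})
print(base)
# {'info': {'title': 'API', 'version': '2.0'}, 'paths': {}, 'servers': [{'url': '/'}]}
```

The nested `"info"` dict is merged, so `"title"` survives while `"version"` is overridden; the list under `"servers"` is not a dict, so it is assigned as-is.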
diff --git a/.venv/lib/python3.9/site-packages/fastapi/websockets.py b/.venv/lib/python3.9/site-packages/fastapi/websockets.py
new file mode 100644
index 0000000..bed672a
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/fastapi/websockets.py
@@ -0,0 +1,2 @@
+from starlette.websockets import WebSocket as WebSocket # noqa
+from starlette.websockets import WebSocketDisconnect as WebSocketDisconnect # noqa
diff --git a/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/INSTALLER b/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/INSTALLER
new file mode 100644
index 0000000..a1b589e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/INSTALLER
@@ -0,0 +1 @@
+pip
diff --git a/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/LICENSE.txt b/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/LICENSE.txt
new file mode 100644
index 0000000..8f080ea
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/LICENSE.txt
@@ -0,0 +1,22 @@
+The MIT License (MIT)
+
+Copyright (c) 2016 Nathaniel J. Smith and other contributors
+
+Permission is hereby granted, free of charge, to any person obtaining
+a copy of this software and associated documentation files (the
+"Software"), to deal in the Software without restriction, including
+without limitation the rights to use, copy, modify, merge, publish,
+distribute, sublicense, and/or sell copies of the Software, and to
+permit persons to whom the Software is furnished to do so, subject to
+the following conditions:
+
+The above copyright notice and this permission notice shall be
+included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
diff --git a/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/METADATA b/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/METADATA
new file mode 100644
index 0000000..5478c3c
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/METADATA
@@ -0,0 +1,194 @@
+Metadata-Version: 2.1
+Name: h11
+Version: 0.12.0
+Summary: A pure-Python, bring-your-own-I/O implementation of HTTP/1.1
+Home-page: https://github.com/python-hyper/h11
+Author: Nathaniel J. Smith
+Author-email: njs@pobox.com
+License: MIT
+Platform: UNKNOWN
+Classifier: Development Status :: 3 - Alpha
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Programming Language :: Python :: Implementation :: CPython
+Classifier: Programming Language :: Python :: Implementation :: PyPy
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3 :: Only
+Classifier: Programming Language :: Python :: 3.6
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Topic :: Internet :: WWW/HTTP
+Classifier: Topic :: System :: Networking
+Requires-Python: >=3.6
+
+h11
+===
+
+.. image:: https://travis-ci.org/python-hyper/h11.svg?branch=master
+ :target: https://travis-ci.org/python-hyper/h11
+ :alt: Automated test status
+
+.. image:: https://codecov.io/gh/python-hyper/h11/branch/master/graph/badge.svg
+ :target: https://codecov.io/gh/python-hyper/h11
+ :alt: Test coverage
+
+.. image:: https://readthedocs.org/projects/h11/badge/?version=latest
+ :target: http://h11.readthedocs.io/en/latest/?badge=latest
+ :alt: Documentation Status
+
+This is a little HTTP/1.1 library written from scratch in Python,
+heavily inspired by `hyper-h2 `_.
+
+It's a "bring-your-own-I/O" library; h11 contains no IO code
+whatsoever. This means you can hook h11 up to your favorite network
+API, and that could be anything you want: synchronous, threaded,
+asynchronous, or your own implementation of `RFC 6214
+`_ -- h11 won't judge you.
+(Compare this to the current state of the art, where every time a `new
+network API `_ comes along then someone
+gets to start over reimplementing the entire HTTP protocol from
+scratch.) Cory Benfield made an `excellent blog post describing the
+benefits of this approach
+`_, or if you like video
+then here's his `PyCon 2016 talk on the same theme
+`_.
+
+This also means that h11 is not immediately useful out of the box:
+it's a toolkit for building programs that speak HTTP, not something
+that could directly replace ``requests`` or ``twisted.web`` or
+whatever. But h11 makes it much easier to implement something like
+``requests`` or ``twisted.web``.
+
+At a high level, working with h11 goes like this:
+
+1) First, create an ``h11.Connection`` object to track the state of a
+ single HTTP/1.1 connection.
+
+2) When you read data off the network, pass it to
+ ``conn.receive_data(...)``; you'll get back a list of objects
+ representing high-level HTTP "events".
+
+3) When you want to send a high-level HTTP event, create the
+ corresponding "event" object and pass it to ``conn.send(...)``;
+ this will give you back some bytes that you can then push out
+ through the network.
+
+For example, a client might instantiate and then send a
+``h11.Request`` object, then zero or more ``h11.Data`` objects for the
+request body (e.g., if this is a POST), and then a
+``h11.EndOfMessage`` to indicate the end of the message. The server
+would then send back a ``h11.Response``, some ``h11.Data``, and
+its own ``h11.EndOfMessage``. If either side violates the protocol,
+you'll get a ``h11.ProtocolError`` exception.
+
+h11 is suitable for implementing both servers and clients, and has a
+pleasantly symmetric API: the events you send as a client are exactly
+the ones that you receive as a server and vice-versa.
+
+`Here's an example of a tiny HTTP client
+`_
+
+It also has `a fine manual `_.
+
+FAQ
+---
+
+*Whyyyyy?*
+
+I wanted to play with HTTP in `Curio
+`__ and `Trio
+`__, which at the time didn't have any
+HTTP libraries. So I thought, no big deal, Python has, like, a dozen
+different implementations of HTTP, surely I can find one that's
+reusable. I didn't find one, but I did find Cory's call-to-arms
+blog-post. So I figured, well, fine, if I have to implement HTTP from
+scratch, at least I can make sure no-one *else* has to ever again.
+
+*Should I use it?*
+
+Maybe. You should be aware that it's a very young project. But, it's
+feature complete and has an exhaustive test-suite and complete docs,
+so the next step is for people to try using it and see how it goes
+:-). If you do then please let us know -- if nothing else we'll want
+to talk to you before making any incompatible changes!
+
+*What are the features/limitations?*
+
+Roughly speaking, it's trying to be a robust, complete, and non-hacky
+implementation of the first "chapter" of the HTTP/1.1 spec: `RFC 7230:
+HTTP/1.1 Message Syntax and Routing
+`_. That is, it mostly focuses on
+implementing HTTP at the level of taking bytes on and off the wire,
+and the headers related to that, and tries to be anal about spec
+conformance. It doesn't know about higher-level concerns like URL
+routing, conditional GETs, cross-origin cookie policies, or content
+negotiation. But it does know how to take care of framing,
+cross-version differences in keep-alive handling, and the "obsolete
+line folding" rule, so you can focus your energies on the hard /
+interesting parts for your application, and it tries to support the
+full specification in the sense that any useful HTTP/1.1 conformant
+application should be able to use h11.
+
+It's pure Python, and has no dependencies outside of the standard
+library.
+
+It has a test suite with 100.0% coverage for both statements and
+branches.
+
+Currently it supports Python 3 (testing on 3.6-3.9) and PyPy 3.
+The last Python 2-compatible version was h11 0.11.x.
+(Originally it had a Cython wrapper for `http-parser
+`_ and a beautiful nested state
+machine implemented with ``yield from`` to postprocess the output. But
+I had to take these out -- the new *parser* needs fewer lines-of-code
+than the old *parser wrapper*, is written in pure Python, uses no
+exotic language syntax, and has more features. It's sad, really; that
+old state machine was really slick. I just need a few sentences here
+to mourn that.)
+
+I don't know how fast it is. I haven't benchmarked or profiled it yet,
+so it's probably got a few pointless hot spots, and I've been trying
+to err on the side of simplicity and robustness instead of
+micro-optimization. But at the architectural level I tried hard to
+avoid fundamentally bad decisions, e.g., I believe that all the
+parsing algorithms remain linear-time even in the face of pathological
+input like slowloris, and there are no byte-by-byte loops. (I also
+believe that it maintains bounded memory usage in the face of
+arbitrary/pathological input.)
+
+The whole library is ~800 lines-of-code. You can read and understand
+the whole thing in less than an hour. Most of the energy invested in
+this so far has been spent on trying to keep things simple by
+minimizing special-cases and ad hoc state manipulation; even though it
+is now quite small and simple, I'm still annoyed that I haven't
+figured out how to make it even smaller and simpler. (Unfortunately,
+HTTP does not lend itself to simplicity.)
+
+The API is ~feature complete and I don't expect the general outlines
+to change much, but you can't judge an API's ergonomics until you
+actually document and use it, so I'd expect some changes in the
+details.
+
+*How do I try it?*
+
+.. code-block:: sh
+
+ $ pip install h11
+ $ git clone git@github.com:python-hyper/h11
+ $ cd h11/examples
+ $ python basic-client.py
+
+and go from there.
+
+*License?*
+
+MIT
+
+*Code of conduct?*
+
+Contributors are requested to follow our `code of conduct
+`_ in
+all project spaces.
+
+
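The "bring-your-own-I/O" flow the h11 README describes (bytes in via `receive_data`, events out; events in via `send`, bytes out) can be illustrated with a toy, stdlib-only connection object. This is *not* h11 itself, just a sketch of the sans-I/O pattern: the parser owns only a byte buffer, and the caller owns all networking.

```python
# Toy sans-I/O "connection": buffers raw bytes and emits one "event"
# per complete CRLF-terminated line, the way h11 emits Request/Data/
# EndOfMessage events from whatever bytes you feed it.
from typing import List


class ToyConnection:
    def __init__(self) -> None:
        self._buffer = b""

    def receive_data(self, data: bytes) -> List[str]:
        # Accumulate bytes; a partial line produces no events yet.
        self._buffer += data
        events = []
        while b"\r\n" in self._buffer:
            line, _, self._buffer = self._buffer.partition(b"\r\n")
            events.append(line.decode("ascii"))
        return events


conn = ToyConnection()
print(conn.receive_data(b"GET / HT"))               # [] - line incomplete
print(conn.receive_data(b"TP/1.1\r\nHost: x\r\n"))  # ['GET / HTTP/1.1', 'Host: x']
```

Because the object never touches a socket, the same parsing logic works unchanged under threads, asyncio, Trio, or a test harness that feeds it canned bytes.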
diff --git a/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/RECORD b/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/RECORD
new file mode 100644
index 0000000..693c50a
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/RECORD
@@ -0,0 +1,51 @@
+h11-0.12.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+h11-0.12.0.dist-info/LICENSE.txt,sha256=N9tbuFkm2yikJ6JYZ_ELEjIAOuob5pzLhRE4rbjm82E,1124
+h11-0.12.0.dist-info/METADATA,sha256=_X-4TWqWCxSJ_mDyAbZPzdxHqP290_yVu09nelJOk04,8109
+h11-0.12.0.dist-info/RECORD,,
+h11-0.12.0.dist-info/WHEEL,sha256=OqRkF0eY5GHssMorFjlbTIq072vpHpF60fIQA6lS9xA,92
+h11-0.12.0.dist-info/top_level.txt,sha256=F7dC4jl3zeh8TGHEPaWJrMbeuoWbS379Gwdi-Yvdcis,4
+h11/__init__.py,sha256=3gYpvQiX8_6-dyXaAxQt_sIYREVTz1T-zB5Lf4hjKt0,909
+h11/__pycache__/__init__.cpython-39.pyc,,
+h11/__pycache__/_abnf.cpython-39.pyc,,
+h11/__pycache__/_connection.cpython-39.pyc,,
+h11/__pycache__/_events.cpython-39.pyc,,
+h11/__pycache__/_headers.cpython-39.pyc,,
+h11/__pycache__/_readers.cpython-39.pyc,,
+h11/__pycache__/_receivebuffer.cpython-39.pyc,,
+h11/__pycache__/_state.cpython-39.pyc,,
+h11/__pycache__/_util.cpython-39.pyc,,
+h11/__pycache__/_version.cpython-39.pyc,,
+h11/__pycache__/_writers.cpython-39.pyc,,
+h11/_abnf.py,sha256=tMKqgOEkTHHp8sPd_gmU9Qowe_yXXrihct63RX2zJsg,4637
+h11/_connection.py,sha256=XFZ-LPb3C2vgF4v5ifmcJqX-a2tHkItucJ7uIGvPYZA,24964
+h11/_events.py,sha256=IJtM7i2TxKv0S-givq2b-oehPVsmsbsIelTW6NHcIvg,9834
+h11/_headers.py,sha256=P2h8Q39SIFiRS9CpYjAwo_99XKJUvLHjn0U3tnm4qHE,9130
+h11/_readers.py,sha256=DmJKQwH9Iu7U3WNljKB09d6iJIO6P2_WtylJEY3HvPY,7280
+h11/_receivebuffer.py,sha256=pMOLWjS53haaCm73O6tSWKFD_6BQQWzVLqLCm2ouvcE,5029
+h11/_state.py,sha256=Upg0_uiO_C_QNXHxLB4YUprEeoeso0i_ma12SOrrA54,12167
+h11/_util.py,sha256=Lw_CoIUMR8wjnvgKwo94FCdmFcIbRQsokmxpBV7LcTI,4387
+h11/_version.py,sha256=14wRZqPo0n2t5kFKCQLsldnyZAfOZoKPJbbwJnbGPcc,686
+h11/_writers.py,sha256=dj8HQ4Pnzq5SjkUZrgh3RKQ6-8Ecy9RKC1MjSo27y4s,4173
+h11/tests/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+h11/tests/__pycache__/__init__.cpython-39.pyc,,
+h11/tests/__pycache__/helpers.cpython-39.pyc,,
+h11/tests/__pycache__/test_against_stdlib_http.cpython-39.pyc,,
+h11/tests/__pycache__/test_connection.cpython-39.pyc,,
+h11/tests/__pycache__/test_events.cpython-39.pyc,,
+h11/tests/__pycache__/test_headers.cpython-39.pyc,,
+h11/tests/__pycache__/test_helpers.cpython-39.pyc,,
+h11/tests/__pycache__/test_io.cpython-39.pyc,,
+h11/tests/__pycache__/test_receivebuffer.cpython-39.pyc,,
+h11/tests/__pycache__/test_state.cpython-39.pyc,,
+h11/tests/__pycache__/test_util.cpython-39.pyc,,
+h11/tests/data/test-file,sha256=ZJ03Rqs98oJw29OHzJg7LlMzyGQaRAY0r3AqBeM2wVU,65
+h11/tests/helpers.py,sha256=nKheRzldPf278C81d_9_Mb9yWsYJ5udwKg_oq-fAz-U,2528
+h11/tests/test_against_stdlib_http.py,sha256=aA4oDd3_jXkapvW0ER9dbGxIiNt6Ytsfs3U2Rd5XtUc,3700
+h11/tests/test_connection.py,sha256=1WybI9IQROZ0QPtR2wQjetPIR_Jwsvw5i5j2fO7XtcI,36375
+h11/tests/test_events.py,sha256=RTPFBIg81Muc7ZoDhsLwaZxthD76R1UCzHF5nzsbM-Q,5182
+h11/tests/test_headers.py,sha256=pa-WMjCk8ZXJFABkojr2db7ZKrgNKiwl-D-hjjt6-Eg,5390
+h11/tests/test_helpers.py,sha256=mPOAiv4HtyG0_T23K_ihh1JUs0y71ykD47c9r3iVtz0,573
+h11/tests/test_io.py,sha256=oaIEAy3ktA_e1xuyP09fX_GiSlS7GKMlFhQIdkg-EhI,15494
+h11/tests/test_receivebuffer.py,sha256=nZ9_LXj3wfyOn4dkgvjnDjZeNTEtxO8-lNphAB0FVF0,3399
+h11/tests/test_state.py,sha256=JMKqA2d2wtskf7FbsAr1s9qsIul4WtwdXVAOCUJgalk,8551
+h11/tests/test_util.py,sha256=j28tMloUSuhlpUxmgvS1PRurRFSbyzWb7yCTp6qy9_Q,2710
diff --git a/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/WHEEL b/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/WHEEL
new file mode 100644
index 0000000..385faab
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/WHEEL
@@ -0,0 +1,5 @@
+Wheel-Version: 1.0
+Generator: bdist_wheel (0.36.2)
+Root-Is-Purelib: true
+Tag: py3-none-any
+
diff --git a/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/top_level.txt b/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/top_level.txt
new file mode 100644
index 0000000..0d24def
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11-0.12.0.dist-info/top_level.txt
@@ -0,0 +1 @@
+h11
diff --git a/.venv/lib/python3.9/site-packages/h11/__init__.py b/.venv/lib/python3.9/site-packages/h11/__init__.py
new file mode 100644
index 0000000..ae39e01
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/__init__.py
@@ -0,0 +1,21 @@
+# A highish-level implementation of the HTTP/1.1 wire protocol (RFC 7230),
+# containing no networking code at all, loosely modelled on hyper-h2's generic
+# implementation of HTTP/2 (and in particular the h2.connection.H2Connection
+# class). There's still a bunch of subtle details you need to get right if you
+# want to make this actually useful, because it doesn't implement all the
+# semantics to check that what you're asking to write to the wire is sensible,
+# but at least it gets you out of dealing with the wire itself.
+
+from ._connection import *
+from ._events import *
+from ._state import *
+from ._util import LocalProtocolError, ProtocolError, RemoteProtocolError
+from ._version import __version__
+
+PRODUCT_ID = "python-h11/" + __version__
+
+
+__all__ = ["ProtocolError", "LocalProtocolError", "RemoteProtocolError"]
+__all__ += _events.__all__
+__all__ += _connection.__all__
+__all__ += _state.__all__
diff --git a/.venv/lib/python3.9/site-packages/h11/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..bb80a3d
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/__pycache__/_abnf.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/__pycache__/_abnf.cpython-39.pyc
new file mode 100644
index 0000000..9a9c22c
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/__pycache__/_abnf.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/__pycache__/_connection.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/__pycache__/_connection.cpython-39.pyc
new file mode 100644
index 0000000..c645ede
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/__pycache__/_connection.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/__pycache__/_events.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/__pycache__/_events.cpython-39.pyc
new file mode 100644
index 0000000..1cad5e8
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/__pycache__/_events.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/__pycache__/_headers.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/__pycache__/_headers.cpython-39.pyc
new file mode 100644
index 0000000..545e8f6
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/__pycache__/_headers.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/__pycache__/_readers.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/__pycache__/_readers.cpython-39.pyc
new file mode 100644
index 0000000..160a284
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/__pycache__/_readers.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/__pycache__/_receivebuffer.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/__pycache__/_receivebuffer.cpython-39.pyc
new file mode 100644
index 0000000..b4a514f
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/__pycache__/_receivebuffer.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/__pycache__/_state.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/__pycache__/_state.cpython-39.pyc
new file mode 100644
index 0000000..ea29cd5
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/__pycache__/_state.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/__pycache__/_util.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/__pycache__/_util.cpython-39.pyc
new file mode 100644
index 0000000..022bbf1
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/__pycache__/_util.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/__pycache__/_version.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/__pycache__/_version.cpython-39.pyc
new file mode 100644
index 0000000..b75b929
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/__pycache__/_version.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/__pycache__/_writers.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/__pycache__/_writers.cpython-39.pyc
new file mode 100644
index 0000000..af456e2
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/__pycache__/_writers.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/_abnf.py b/.venv/lib/python3.9/site-packages/h11/_abnf.py
new file mode 100644
index 0000000..e6d49e1
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/_abnf.py
@@ -0,0 +1,129 @@
+# We use native strings for all the re patterns, to take advantage of string
+# formatting, and then convert to bytestrings when compiling the final re
+# objects.
+
+# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#whitespace
+# OWS = *( SP / HTAB )
+# ; optional whitespace
+OWS = r"[ \t]*"
+
+# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#rule.token.separators
+# token = 1*tchar
+#
+# tchar = "!" / "#" / "$" / "%" / "&" / "'" / "*"
+# / "+" / "-" / "." / "^" / "_" / "`" / "|" / "~"
+# / DIGIT / ALPHA
+# ; any VCHAR, except delimiters
+token = r"[-!#$%&'*+.^_`|~0-9a-zA-Z]+"
+
+# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#header.fields
+# field-name = token
+field_name = token
+
+# The standard says:
+#
+# field-value = *( field-content / obs-fold )
+# field-content = field-vchar [ 1*( SP / HTAB ) field-vchar ]
+# field-vchar = VCHAR / obs-text
+# obs-fold = CRLF 1*( SP / HTAB )
+# ; obsolete line folding
+# ; see Section 3.2.4
+#
+# https://tools.ietf.org/html/rfc5234#appendix-B.1
+#
+# VCHAR = %x21-7E
+# ; visible (printing) characters
+#
+# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#rule.quoted-string
+# obs-text = %x80-FF
+#
+# However, the standard definition of field-content is WRONG! It disallows
+# fields containing a single visible character surrounded by whitespace,
+# e.g. "foo a bar".
+#
+# See: https://www.rfc-editor.org/errata_search.php?rfc=7230&eid=4189
+#
+# So our definition of field_content attempts to fix it up...
+#
+# Also, we allow lots of control characters, because apparently people assume
+# that they're legal in practice (e.g., google analytics makes cookies with
+# \x01 in them!):
+# https://github.com/python-hyper/h11/issues/57
+# We still don't allow NUL or whitespace, because those are often treated as
+# meta-characters and letting them through can lead to nasty issues like SSRF.
+vchar = r"[\x21-\x7e]"
+vchar_or_obs_text = r"[^\x00\s]"
+field_vchar = vchar_or_obs_text
+field_content = r"{field_vchar}+(?:[ \t]+{field_vchar}+)*".format(**globals())
+
+# We handle obs-fold at a different level, and our fixed-up field_content
+# already grows to swallow the whole value, so ? instead of *
+field_value = r"({field_content})?".format(**globals())
+
+# header-field = field-name ":" OWS field-value OWS
+header_field = (
+ r"(?P<field_name>{field_name})"
+ r":"
+ r"{OWS}"
+ r"(?P<field_value>{field_value})"
+ r"{OWS}".format(**globals())
+)
+
+# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#request.line
+#
+# request-line = method SP request-target SP HTTP-version CRLF
+# method = token
+# HTTP-version = HTTP-name "/" DIGIT "." DIGIT
+# HTTP-name = %x48.54.54.50 ; "HTTP", case-sensitive
+#
+# request-target is complicated (see RFC 7230 sec 5.3) -- could be path, full
+# URL, host+port (for connect), or even "*", but in any case we are guaranteed
+# that it consists of the visible printing characters.
+method = token
+request_target = r"{vchar}+".format(**globals())
+http_version = r"HTTP/(?P<http_version>[0-9]\.[0-9])"
+request_line = (
+ r"(?P<method>{method})"
+ r" "
+ r"(?P<target>{request_target})"
+ r" "
+ r"{http_version}".format(**globals())
+)
+
+# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#status.line
+#
+# status-line = HTTP-version SP status-code SP reason-phrase CRLF
+# status-code = 3DIGIT
+# reason-phrase = *( HTAB / SP / VCHAR / obs-text )
+status_code = r"[0-9]{3}"
+reason_phrase = r"([ \t]|{vchar_or_obs_text})*".format(**globals())
+status_line = (
+ r"{http_version}"
+ r" "
+ r"(?P<status_code>{status_code})"
+ # However, there are apparently a few too many servers out there that just
+ # leave out the reason phrase:
+ # https://github.com/scrapy/scrapy/issues/345#issuecomment-281756036
+ # https://github.com/seanmonstar/httparse/issues/29
+ # so make it optional. ?: is a non-capturing group.
+ r"(?: (?P<reason>{reason_phrase}))?".format(**globals())
+)
+
+HEXDIG = r"[0-9A-Fa-f]"
+# Actually
+#
+# chunk-size = 1*HEXDIG
+#
+# but we impose an upper-limit to avoid ridiculosity. len(str(2**64)) == 20
+chunk_size = r"({HEXDIG}){{1,20}}".format(**globals())
+# Actually
+#
+# chunk-ext = *( ";" chunk-ext-name [ "=" chunk-ext-val ] )
+#
+# but we aren't parsing the things so we don't really care.
+chunk_ext = r";.*"
+chunk_header = (
+ r"(?P<chunk_size>{chunk_size})"
+ r"(?P<chunk_ext>{chunk_ext})?"
+ r"\r\n".format(**globals())
+)
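These fragments compose into complete matchers through ordinary string formatting, then get encoded to bytes before compilation. A minimal standalone sketch (rebuilding the relevant pieces locally rather than importing h11's private `_abnf` module) shows the header-field pattern in action:

```python
import re

# Local copies of the fragments above, for illustration only.
OWS = r"[ \t]*"
token = r"[-!#$%&'*+.^_`|~0-9a-zA-Z]+"
field_name = token
field_vchar = r"[^\x00\s]"
field_content = r"{fv}+(?:[ \t]+{fv}+)*".format(fv=field_vchar)
field_value = r"({})?".format(field_content)

header_field = (
    r"(?P<field_name>{field_name})"
    r":"
    r"{OWS}"
    r"(?P<field_value>{field_value})"
    r"{OWS}".format(field_name=field_name, field_value=field_value, OWS=OWS)
)

# The patterns are native strings; encode to ASCII to match against bytes,
# as h11 itself does.
header_re = re.compile(header_field.encode("ascii"))
m = header_re.fullmatch(b"Content-Type:  text/html ")
print(m.group("field_name"))   # b'Content-Type'
print(m.group("field_value"))  # b'text/html' (surrounding OWS stripped)
```

Note that optional whitespace around the value is consumed by the `OWS` pieces, so the captured `field_value` comes out already trimmed.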
diff --git a/.venv/lib/python3.9/site-packages/h11/_connection.py b/.venv/lib/python3.9/site-packages/h11/_connection.py
new file mode 100644
index 0000000..6f796ef
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/_connection.py
@@ -0,0 +1,585 @@
+# This contains the main Connection class. Everything in h11 revolves around
+# this.
+
+from ._events import * # Import all event types
+from ._headers import get_comma_header, has_expect_100_continue, set_comma_header
+from ._readers import READERS
+from ._receivebuffer import ReceiveBuffer
+from ._state import * # Import all state sentinels
+from ._state import _SWITCH_CONNECT, _SWITCH_UPGRADE, ConnectionState
+from ._util import ( # Import the internal things we need
+ LocalProtocolError,
+ make_sentinel,
+ RemoteProtocolError,
+)
+from ._writers import WRITERS
+
+# Everything in __all__ gets re-exported as part of the h11 public API.
+__all__ = ["Connection", "NEED_DATA", "PAUSED"]
+
+NEED_DATA = make_sentinel("NEED_DATA")
+PAUSED = make_sentinel("PAUSED")
+
+# If we ever have this much buffered without it making a complete parseable
+# event, we error out. The only time we really buffer is when reading the
+# request/response line + headers together, so this is effectively the limit on
+# the size of that.
+#
+# Some precedents for defaults:
+# - node.js: 80 * 1024
+# - tomcat: 8 * 1024
+# - IIS: 16 * 1024
+# - Apache: <8 KiB per line>
+DEFAULT_MAX_INCOMPLETE_EVENT_SIZE = 16 * 1024
+
+# RFC 7230's rules for connection lifecycles:
+# - If either side says they want to close the connection, then the connection
+# must close.
+# - HTTP/1.1 defaults to keep-alive unless someone says Connection: close
+# - HTTP/1.0 defaults to close unless both sides say Connection: keep-alive
+# (and even this is a mess -- e.g. if you're implementing a proxy then
+# sending Connection: keep-alive is forbidden).
+#
+# We simplify life by simply not supporting keep-alive with HTTP/1.0 peers. So
+# our rule is:
+# - If someone says Connection: close, we will close
+# - If someone uses HTTP/1.0, we will close.
+def _keep_alive(event):
+ connection = get_comma_header(event.headers, b"connection")
+ if b"close" in connection:
+ return False
+ if getattr(event, "http_version", b"1.1") < b"1.1":
+ return False
+ return True
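The rule above can be sketched as a standalone function, with plain (name, value) byte tuples standing in for h11 event headers (the `keep_alive` name and signature here are illustrative, not h11 API):

```python
def keep_alive(http_version, headers):
    """Sketch of the rule: close on Connection: close or any HTTP/1.0 peer."""
    # Gather comma-separated Connection: tokens, case-insensitively.
    tokens = []
    for name, value in headers:
        if name.lower() == b"connection":
            tokens.extend(t.strip().lower() for t in value.split(b","))
    if b"close" in tokens:
        return False
    # Byte strings compare lexicographically, so b"1.0" < b"1.1" holds.
    if http_version < b"1.1":
        return False
    return True

print(keep_alive(b"1.1", [(b"Host", b"example.com")]))              # True
print(keep_alive(b"1.1", [(b"Connection", b"keep-alive, close")]))  # False
print(keep_alive(b"1.0", []))                                       # False
```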
+
+
+def _body_framing(request_method, event):
+ # Called when we enter SEND_BODY to figure out framing information for
+ # this body.
+ #
+ # These are the only two events that can trigger a SEND_BODY state:
+ assert type(event) in (Request, Response)
+ # Returns one of:
+ #
+ # ("content-length", count)
+ # ("chunked", ())
+ # ("http/1.0", ())
+ #
+ # which are (lookup key, *args) for constructing body reader/writer
+ # objects.
+ #
+ # Reference: https://tools.ietf.org/html/rfc7230#section-3.3.3
+ #
+ # Step 1: some responses always have an empty body, regardless of what the
+ # headers say.
+ if type(event) is Response:
+ if (
+ event.status_code in (204, 304)
+ or request_method == b"HEAD"
+ or (request_method == b"CONNECT" and 200 <= event.status_code < 300)
+ ):
+ return ("content-length", (0,))
+ # Section 3.3.3 also lists another case -- responses with status_code
+ # < 200. For us these are InformationalResponses, not Responses, so
+ # they can't get into this function in the first place.
+ assert event.status_code >= 200
+
+ # Step 2: check for Transfer-Encoding (T-E beats C-L):
+ transfer_encodings = get_comma_header(event.headers, b"transfer-encoding")
+ if transfer_encodings:
+ assert transfer_encodings == [b"chunked"]
+ return ("chunked", ())
+
+ # Step 3: check for Content-Length
+ content_lengths = get_comma_header(event.headers, b"content-length")
+ if content_lengths:
+ return ("content-length", (int(content_lengths[0]),))
+
+ # Step 4: no applicable headers; fallback/default depends on type
+ if type(event) is Request:
+ return ("content-length", (0,))
+ else:
+ return ("http/1.0", ())
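The same RFC 7230 section 3.3.3 decision procedure, sketched standalone with an illustrative signature (not h11's internal one), returning the same (lookup key, args) shape:

```python
def body_framing(request_method, is_response, status_code, headers):
    """Sketch of the framing decision: bodiless responses, then T-E, then C-L."""
    if is_response:
        # Some responses never have a body, whatever the headers claim.
        if (status_code in (204, 304)
                or request_method == b"HEAD"
                or (request_method == b"CONNECT" and 200 <= status_code < 300)):
            return ("content-length", (0,))
    # Transfer-Encoding beats Content-Length.
    if any(k.lower() == b"transfer-encoding" for k, v in headers):
        return ("chunked", ())
    content_lengths = [v for k, v in headers if k.lower() == b"content-length"]
    if content_lengths:
        return ("content-length", (int(content_lengths[0]),))
    # No framing headers: requests default to an empty body, responses to
    # HTTP/1.0-style read-until-close.
    return ("content-length", (0,)) if not is_response else ("http/1.0", ())

print(body_framing(b"GET", True, 200, [(b"Content-Length", b"42")]))
# ('content-length', (42,))
print(body_framing(b"GET", True, 200, []))   # ('http/1.0', ())
print(body_framing(b"HEAD", True, 200, [(b"Content-Length", b"42")]))
# ('content-length', (0,))
```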
+
+
+################################################################
+#
+# The main Connection class
+#
+################################################################
+
+
+class Connection:
+ """An object encapsulating the state of an HTTP connection.
+
+ Args:
+ our_role: If you're implementing a client, pass :data:`h11.CLIENT`. If
+ you're implementing a server, pass :data:`h11.SERVER`.
+
+ max_incomplete_event_size (int):
+ The maximum number of bytes we're willing to buffer of an
+ incomplete event. In practice this mostly sets a limit on the
+ maximum size of the request/response line + headers. If this is
+ exceeded, then :meth:`next_event` will raise
+ :exc:`RemoteProtocolError`.
+
+ """
+
+ def __init__(
+ self, our_role, max_incomplete_event_size=DEFAULT_MAX_INCOMPLETE_EVENT_SIZE
+ ):
+ self._max_incomplete_event_size = max_incomplete_event_size
+ # State and role tracking
+ if our_role not in (CLIENT, SERVER):
+ raise ValueError("expected CLIENT or SERVER, not {!r}".format(our_role))
+ self.our_role = our_role
+ if our_role is CLIENT:
+ self.their_role = SERVER
+ else:
+ self.their_role = CLIENT
+ self._cstate = ConnectionState()
+
+ # Callables for converting data->events or vice-versa given the
+ # current state
+ self._writer = self._get_io_object(self.our_role, None, WRITERS)
+ self._reader = self._get_io_object(self.their_role, None, READERS)
+
+ # Holds any unprocessed received data
+ self._receive_buffer = ReceiveBuffer()
+ # If this is true, then it indicates that the incoming connection was
+ # closed *after* the end of whatever's in self._receive_buffer:
+ self._receive_buffer_closed = False
+
+ # Extra bits of state that don't fit into the state machine.
+ #
+ # These two are only used to interpret framing headers for figuring
+ # out how to read/write response bodies. their_http_version is also
+ # made available as a convenient public API.
+ self.their_http_version = None
+ self._request_method = None
+ # This is pure flow-control and doesn't at all affect the set of legal
+ # transitions, so no need to bother ConnectionState with it:
+ self.client_is_waiting_for_100_continue = False
+
+ @property
+ def states(self):
+ """A dictionary like::
+
+ {CLIENT: <client state>, SERVER: <server state>}
+
+ See :ref:`state-machine` for details.
+
+ """
+ return dict(self._cstate.states)
+
+ @property
+ def our_state(self):
+ """The current state of whichever role we are playing. See
+ :ref:`state-machine` for details.
+ """
+ return self._cstate.states[self.our_role]
+
+ @property
+ def their_state(self):
+ """The current state of whichever role we are NOT playing. See
+ :ref:`state-machine` for details.
+ """
+ return self._cstate.states[self.their_role]
+
+ @property
+ def they_are_waiting_for_100_continue(self):
+ return self.their_role is CLIENT and self.client_is_waiting_for_100_continue
+
+ def start_next_cycle(self):
+ """Attempt to reset our connection state for a new request/response
+ cycle.
+
+ If both client and server are in :data:`DONE` state, then resets them
+ both to :data:`IDLE` state in preparation for a new request/response
+ cycle on this same connection. Otherwise, raises a
+ :exc:`LocalProtocolError`.
+
+ See :ref:`keepalive-and-pipelining`.
+
+ """
+ old_states = dict(self._cstate.states)
+ self._cstate.start_next_cycle()
+ self._request_method = None
+ # self.their_http_version gets left alone, since it presumably lasts
+ # beyond a single request/response cycle
+ assert not self.client_is_waiting_for_100_continue
+ self._respond_to_state_changes(old_states)
+
+ def _process_error(self, role):
+ old_states = dict(self._cstate.states)
+ self._cstate.process_error(role)
+ self._respond_to_state_changes(old_states)
+
+ def _server_switch_event(self, event):
+ if type(event) is InformationalResponse and event.status_code == 101:
+ return _SWITCH_UPGRADE
+ if type(event) is Response:
+ if (
+ _SWITCH_CONNECT in self._cstate.pending_switch_proposals
+ and 200 <= event.status_code < 300
+ ):
+ return _SWITCH_CONNECT
+ return None
+
+ # All events go through here
+ def _process_event(self, role, event):
+ # First, pass the event through the state machine to make sure it
+ # succeeds.
+ old_states = dict(self._cstate.states)
+ if role is CLIENT and type(event) is Request:
+ if event.method == b"CONNECT":
+ self._cstate.process_client_switch_proposal(_SWITCH_CONNECT)
+ if get_comma_header(event.headers, b"upgrade"):
+ self._cstate.process_client_switch_proposal(_SWITCH_UPGRADE)
+ server_switch_event = None
+ if role is SERVER:
+ server_switch_event = self._server_switch_event(event)
+ self._cstate.process_event(role, type(event), server_switch_event)
+
+ # Then perform the updates triggered by it.
+
+ # self._request_method
+ if type(event) is Request:
+ self._request_method = event.method
+
+ # self.their_http_version
+ if role is self.their_role and type(event) in (
+ Request,
+ Response,
+ InformationalResponse,
+ ):
+ self.their_http_version = event.http_version
+
+ # Keep alive handling
+ #
+ # RFC 7230 doesn't really say what one should do if Connection: close
+ # shows up on a 1xx InformationalResponse. I think the idea is that
+ # this is not supposed to happen. In any case, if it does happen, we
+ # ignore it.
+ if type(event) in (Request, Response) and not _keep_alive(event):
+ self._cstate.process_keep_alive_disabled()
+
+ # 100-continue
+ if type(event) is Request and has_expect_100_continue(event):
+ self.client_is_waiting_for_100_continue = True
+ if type(event) in (InformationalResponse, Response):
+ self.client_is_waiting_for_100_continue = False
+ if role is CLIENT and type(event) in (Data, EndOfMessage):
+ self.client_is_waiting_for_100_continue = False
+
+ self._respond_to_state_changes(old_states, event)
+
+ def _get_io_object(self, role, event, io_dict):
+ # event may be None; it's only used when entering SEND_BODY
+ state = self._cstate.states[role]
+ if state is SEND_BODY:
+ # Special case: the io_dict has a dict of reader/writer factories
+ # that depend on the request/response framing.
+ framing_type, args = _body_framing(self._request_method, event)
+ return io_dict[SEND_BODY][framing_type](*args)
+ else:
+ # General case: the io_dict just has the appropriate reader/writer
+ # for this state
+ return io_dict.get((role, state))
+
+ # This must be called after any action that might have caused
+ # self._cstate.states to change.
+ def _respond_to_state_changes(self, old_states, event=None):
+ # Update reader/writer
+ if self.our_state != old_states[self.our_role]:
+ self._writer = self._get_io_object(self.our_role, event, WRITERS)
+ if self.their_state != old_states[self.their_role]:
+ self._reader = self._get_io_object(self.their_role, event, READERS)
+
+ @property
+ def trailing_data(self):
+ """Data that has been received, but not yet processed, represented as
+ a tuple with two elements, where the first is a byte-string containing
+ the unprocessed data itself, and the second is a bool that is True if
+ the receive connection was closed.
+
+ See :ref:`switching-protocols` for discussion of why you'd want this.
+ """
+ return (bytes(self._receive_buffer), self._receive_buffer_closed)
+
+ def receive_data(self, data):
+ """Add data to our internal receive buffer.
+
+ This does not actually do any processing on the data, just stores
+ it. To trigger processing, you have to call :meth:`next_event`.
+
+ Args:
+ data (:term:`bytes-like object`):
+ The new data that was just received.
+
+ Special case: If *data* is an empty byte-string like ``b""``,
+ then this indicates that the remote side has closed the
+ connection (end of file). Normally this is convenient, because
+ standard Python APIs like :meth:`file.read` or
+ :meth:`socket.recv` use ``b""`` to indicate end-of-file, while
+ other failures to read are indicated using other mechanisms
+ like raising :exc:`TimeoutError`. When using such an API you
+ can just blindly pass through whatever you get from ``read``
+ to :meth:`receive_data`, and everything will work.
+
+ But, if you have an API where reading an empty string is a
+ valid non-EOF condition, then you need to be aware of this and
+ make sure to check for such strings and avoid passing them to
+ :meth:`receive_data`.
+
+ Returns:
+ Nothing, but after calling this you should call :meth:`next_event`
+ to parse the newly received data.
+
+ Raises:
+ RuntimeError:
+ Raised if you pass an empty *data*, indicating EOF, and then
+ pass a non-empty *data*, indicating more data that somehow
+ arrived after the EOF.
+
+ (Calling ``receive_data(b"")`` multiple times is fine,
+ and equivalent to calling it once.)
+
+ """
+ if data:
+ if self._receive_buffer_closed:
+ raise RuntimeError("received close, then received more data?")
+ self._receive_buffer += data
+ else:
+ self._receive_buffer_closed = True
+
+ def _extract_next_receive_event(self):
+ state = self.their_state
+ # We don't pause immediately when they enter DONE, because even in
+ # DONE state we can still process a ConnectionClosed() event. But
+ # if we have data in our buffer, then we definitely aren't getting
+ # a ConnectionClosed() immediately and we need to pause.
+ if state is DONE and self._receive_buffer:
+ return PAUSED
+ if state is MIGHT_SWITCH_PROTOCOL or state is SWITCHED_PROTOCOL:
+ return PAUSED
+ assert self._reader is not None
+ event = self._reader(self._receive_buffer)
+ if event is None:
+ if not self._receive_buffer and self._receive_buffer_closed:
+ # In some unusual cases (basically just HTTP/1.0 bodies), EOF
+ # triggers an actual protocol event; in that case, we want to
+ # return that event, and then the state will change and we'll
+ # get called again to generate the actual ConnectionClosed().
+ if hasattr(self._reader, "read_eof"):
+ event = self._reader.read_eof()
+ else:
+ event = ConnectionClosed()
+ if event is None:
+ event = NEED_DATA
+ return event
+
+ def next_event(self):
+ """Parse the next event out of our receive buffer, update our internal
+ state, and return it.
+
+ This is a mutating operation -- think of it like calling :func:`next`
+ on an iterator.
+
+ Returns:
+ : One of three things:
+
+ 1) An event object -- see :ref:`events`.
+
+ 2) The special constant :data:`NEED_DATA`, which indicates that
+ you need to read more data from your socket and pass it to
+ :meth:`receive_data` before this method will be able to return
+ any more events.
+
+ 3) The special constant :data:`PAUSED`, which indicates that we
+ are not in a state where we can process incoming data (usually
+ because the peer has finished their part of the current
+ request/response cycle, and you have not yet called
+ :meth:`start_next_cycle`). See :ref:`flow-control` for details.
+
+ Raises:
+ RemoteProtocolError:
+ The peer has misbehaved. You should close the connection
+ (possibly after sending some kind of 4xx response).
+
+ Once this method returns :class:`ConnectionClosed` once, then all
+ subsequent calls will also return :class:`ConnectionClosed`.
+
+ If this method raises any exception besides :exc:`RemoteProtocolError`
+ then that's a bug -- if it happens please file a bug report!
+
+ If this method raises any exception then it also sets
+ :attr:`Connection.their_state` to :data:`ERROR` -- see
+ :ref:`error-handling` for discussion.
+
+ """
+
+ if self.their_state is ERROR:
+ raise RemoteProtocolError("Can't receive data when peer state is ERROR")
+ try:
+ event = self._extract_next_receive_event()
+ if event not in [NEED_DATA, PAUSED]:
+ self._process_event(self.their_role, event)
+ if event is NEED_DATA:
+ if len(self._receive_buffer) > self._max_incomplete_event_size:
+ # 431 is "Request header fields too large" which is pretty
+ # much the only situation where we can get here
+ raise RemoteProtocolError(
+ "Receive buffer too long", error_status_hint=431
+ )
+ if self._receive_buffer_closed:
+ # We're still trying to complete some event, but that's
+ # never going to happen because no more data is coming
+ raise RemoteProtocolError("peer unexpectedly closed connection")
+ return event
+ except BaseException as exc:
+ self._process_error(self.their_role)
+ if isinstance(exc, LocalProtocolError):
+ exc._reraise_as_remote_protocol_error()
+ else:
+ raise
+
+ def send(self, event):
+ """Convert a high-level event into bytes that can be sent to the peer,
+ while updating our internal state machine.
+
+ Args:
+ event: The :ref:`event <events>` to send.
+
+ Returns:
+ If ``type(event) is ConnectionClosed``, then returns
+ ``None``. Otherwise, returns a :term:`bytes-like object`.
+
+ Raises:
+ LocalProtocolError:
+ Sending this event at this time would violate our
+ understanding of the HTTP/1.1 protocol.
+
+ If this method raises any exception then it also sets
+ :attr:`Connection.our_state` to :data:`ERROR` -- see
+ :ref:`error-handling` for discussion.
+
+ """
+ data_list = self.send_with_data_passthrough(event)
+ if data_list is None:
+ return None
+ else:
+ return b"".join(data_list)
+
+ def send_with_data_passthrough(self, event):
+ """Identical to :meth:`send`, except that in situations where
+ :meth:`send` returns a single :term:`bytes-like object`, this instead
+ returns a list of them -- and when sending a :class:`Data` event, this
+ list is guaranteed to contain the exact object you passed in as
+ :attr:`Data.data`. See :ref:`sendfile` for discussion.
+
+ """
+ if self.our_state is ERROR:
+ raise LocalProtocolError("Can't send data when our state is ERROR")
+ try:
+ if type(event) is Response:
+ self._clean_up_response_headers_for_sending(event)
+ # We want to call _process_event before calling the writer,
+ # because if someone tries to do something invalid then this will
+ # give a sensible error message, while our writers all just assume
+ # they will only receive valid events. But, _process_event might
+ # change self._writer. So we have to do a little dance:
+ writer = self._writer
+ self._process_event(self.our_role, event)
+ if type(event) is ConnectionClosed:
+ return None
+ else:
+ # In any situation where writer is None, process_event should
+ # have raised ProtocolError
+ assert writer is not None
+ data_list = []
+ writer(event, data_list.append)
+ return data_list
+ except:
+ self._process_error(self.our_role)
+ raise
+
+ def send_failed(self):
+ """Notify the state machine that we failed to send the data it gave
+ us.
+
+ This causes :attr:`Connection.our_state` to immediately become
+ :data:`ERROR` -- see :ref:`error-handling` for discussion.
+
+ """
+ self._process_error(self.our_role)
+
+ # When sending a Response, we take responsibility for a few things:
+ #
+ # - Sometimes you MUST set Connection: close. We take care of those
+ # times. (You can also set it yourself if you want, and if you do then
+ # we'll respect that and close the connection at the right time. But you
+ # don't have to worry about that unless you want to.)
+ #
+ # - The user has to set Content-Length if they want it. Otherwise, for
+ # responses that have bodies (e.g. not HEAD), then we will automatically
+ # select the right mechanism for streaming a body of unknown length,
+ # which depends on the peer's HTTP version.
+ #
+ # This function's *only* responsibility is making sure headers are set up
+ # right -- everything downstream just looks at the headers. There are no
+ # side channels. It mutates the response event in-place (but not the
+ # response.headers list object).
+ def _clean_up_response_headers_for_sending(self, response):
+ assert type(response) is Response
+
+ headers = response.headers
+ need_close = False
+
+ # HEAD requests need some special handling: they always act like they
+ # have Content-Length: 0, and that's how _body_framing treats
+ # them. But their headers are supposed to match what we would send if
+ # the request was a GET. (Technically there is one deviation allowed:
+ # we're allowed to leave out the framing headers -- see
+ # https://tools.ietf.org/html/rfc7231#section-4.3.2 . But it's just as
+ # easy to get them right.)
+ method_for_choosing_headers = self._request_method
+ if method_for_choosing_headers == b"HEAD":
+ method_for_choosing_headers = b"GET"
+ framing_type, _ = _body_framing(method_for_choosing_headers, response)
+ if framing_type in ("chunked", "http/1.0"):
+ # This response has a body of unknown length.
+ # If our peer is HTTP/1.1, we use Transfer-Encoding: chunked
+ # If our peer is HTTP/1.0, we use no framing headers, and close the
+ # connection afterwards.
+ #
+ # Make sure to clear Content-Length (in principle user could have
+ # set both and then we ignored Content-Length b/c
+ # Transfer-Encoding overwrote it -- this would be naughty of them,
+ # but the HTTP spec says that if our peer does this then we have
+ # to fix it instead of erroring out, so we'll accord the user the
+ # same respect).
+ headers = set_comma_header(headers, b"content-length", [])
+ if self.their_http_version is None or self.their_http_version < b"1.1":
+ # Either we never got a valid request and are sending back an
+ # error (their_http_version is None), so we assume the worst;
+ # or else we did get a valid HTTP/1.0 request, so we know that
+ # they don't understand chunked encoding.
+ headers = set_comma_header(headers, b"transfer-encoding", [])
+ # This is actually redundant ATM, since currently we
+ # unconditionally disable keep-alive when talking to HTTP/1.0
+ # peers. But let's be defensive just in case we add
+ # Connection: keep-alive support later:
+ if self._request_method != b"HEAD":
+ need_close = True
+ else:
+ headers = set_comma_header(headers, b"transfer-encoding", [b"chunked"])
+
+ if not self._cstate.keep_alive or need_close:
+ # Make sure Connection: close is set
+ connection = set(get_comma_header(headers, b"connection"))
+ connection.discard(b"keep-alive")
+ connection.add(b"close")
+ headers = set_comma_header(headers, b"connection", sorted(connection))
+
+ response.headers = headers
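The final Connection-header fix-up at the end of `_clean_up_response_headers_for_sending` can be illustrated standalone (the `force_close` helper below is hypothetical, not part of h11's API): merge `b"close"` into whatever Connection tokens are already present, dropping `b"keep-alive"`.

```python
def force_close(headers):
    """Sketch: rewrite a (name, value) header list so Connection: close is set."""
    tokens = set()
    rest = []
    for name, value in headers:
        if name.lower() == b"connection":
            tokens.update(t.strip() for t in value.split(b","))
        else:
            rest.append((name, value))
    tokens.discard(b"keep-alive")
    tokens.add(b"close")
    # Sort for a deterministic header value, as h11 does.
    return rest + [(b"connection", b", ".join(sorted(tokens)))]

print(force_close([(b"Content-Type", b"text/plain"),
                   (b"Connection", b"keep-alive, upgrade")]))
# [(b'Content-Type', b'text/plain'), (b'connection', b'close, upgrade')]
```

Preserving unrelated tokens (like `upgrade`) while forcing `close` is what lets callers set their own Connection values without fighting the library.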
diff --git a/.venv/lib/python3.9/site-packages/h11/_events.py b/.venv/lib/python3.9/site-packages/h11/_events.py
new file mode 100644
index 0000000..1827930
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/_events.py
@@ -0,0 +1,302 @@
+# High level events that make up HTTP/1.1 conversations. Loosely inspired by
+# the corresponding events in hyper-h2:
+#
+# http://python-hyper.org/h2/en/stable/api.html#events
+#
+# Don't subclass these. Stuff will break.
+
+import re
+
+from . import _headers
+from ._abnf import request_target
+from ._util import bytesify, LocalProtocolError, validate
+
+# Everything in __all__ gets re-exported as part of the h11 public API.
+__all__ = [
+ "Request",
+ "InformationalResponse",
+ "Response",
+ "Data",
+ "EndOfMessage",
+ "ConnectionClosed",
+]
+
+request_target_re = re.compile(request_target.encode("ascii"))
+
+
+class _EventBundle:
+ _fields = []
+ _defaults = {}
+
+ def __init__(self, **kwargs):
+ _parsed = kwargs.pop("_parsed", False)
+ allowed = set(self._fields)
+ for kwarg in kwargs:
+ if kwarg not in allowed:
+ raise TypeError(
+ "unrecognized kwarg {} for {}".format(
+ kwarg, self.__class__.__name__
+ )
+ )
+ required = allowed.difference(self._defaults)
+ for field in required:
+ if field not in kwargs:
+ raise TypeError(
+ "missing required kwarg {} for {}".format(
+ field, self.__class__.__name__
+ )
+ )
+ self.__dict__.update(self._defaults)
+ self.__dict__.update(kwargs)
+
+ # Special handling for some fields
+
+ if "headers" in self.__dict__:
+ self.headers = _headers.normalize_and_validate(
+ self.headers, _parsed=_parsed
+ )
+
+ if not _parsed:
+ for field in ["method", "target", "http_version", "reason"]:
+ if field in self.__dict__:
+ self.__dict__[field] = bytesify(self.__dict__[field])
+
+ if "status_code" in self.__dict__:
+ if not isinstance(self.status_code, int):
+ raise LocalProtocolError("status code must be integer")
+ # Because IntEnum objects are instances of int, but aren't
+ # duck-compatible (sigh), see gh-72.
+ self.status_code = int(self.status_code)
+
+ self._validate()
+
+ def _validate(self):
+ pass
+
+ def __repr__(self):
+ name = self.__class__.__name__
+ kwarg_strs = [
+ "{}={}".format(field, self.__dict__[field]) for field in self._fields
+ ]
+ kwarg_str = ", ".join(kwarg_strs)
+ return "{}({})".format(name, kwarg_str)
+
+ # Useful for tests
+ def __eq__(self, other):
+ return self.__class__ == other.__class__ and self.__dict__ == other.__dict__
+
+ # This is an unhashable type.
+ __hash__ = None
+
+
+class Request(_EventBundle):
+ """The beginning of an HTTP request.
+
+ Fields:
+
+ .. attribute:: method
+
+ An HTTP method, e.g. ``b"GET"`` or ``b"POST"``. Always a byte
+ string. :term:`Bytes-like objects <bytes-like object>` and native
+ strings containing only ascii characters will be automatically
+ converted to byte strings.
+
+ .. attribute:: target
+
+ The target of an HTTP request, e.g. ``b"/index.html"``, or one of the
+ more exotic formats described in `RFC 7230, section 5.3
+ <https://tools.ietf.org/html/rfc7230#section-5.3>`_. Always a byte
+ string. :term:`Bytes-like objects <bytes-like object>` and native
+ strings containing only ascii characters will be automatically
+ converted to byte strings.
+
+ .. attribute:: headers
+
+ Request headers, represented as a list of (name, value) pairs. See
+ :ref:`the header normalization rules <headers-format>` for details.
+
+ .. attribute:: http_version
+
+ The HTTP protocol version, represented as a byte string like
+ ``b"1.1"``. See :ref:`the HTTP version normalization rules
+ <http_version-format>` for details.
+
+ """
+
+ _fields = ["method", "target", "headers", "http_version"]
+ _defaults = {"http_version": b"1.1"}
+
+ def _validate(self):
+ # "A server MUST respond with a 400 (Bad Request) status code to any
+ # HTTP/1.1 request message that lacks a Host header field and to any
+ # request message that contains more than one Host header field or a
+ # Host header field with an invalid field-value."
+ # -- https://tools.ietf.org/html/rfc7230#section-5.4
+ host_count = 0
+ for name, value in self.headers:
+ if name == b"host":
+ host_count += 1
+ if self.http_version == b"1.1" and host_count == 0:
+ raise LocalProtocolError("Missing mandatory Host: header")
+ if host_count > 1:
+ raise LocalProtocolError("Found multiple Host: headers")
+
+ validate(request_target_re, self.target, "Illegal target characters")
+
+
+class _ResponseBase(_EventBundle):
+ _fields = ["status_code", "headers", "http_version", "reason"]
+ _defaults = {"http_version": b"1.1", "reason": b""}
+
+
+class InformationalResponse(_ResponseBase):
+ """An HTTP informational response.
+
+ Fields:
+
+ .. attribute:: status_code
+
+ The status code of this response, as an integer. For an
+ :class:`InformationalResponse`, this is always in the range [100,
+ 200).
+
+ .. attribute:: headers
+
+ Response headers, represented as a list of (name, value) pairs. See
+ :ref:`the header normalization rules <headers-format>` for
+ details.
+
+ .. attribute:: http_version
+
+ The HTTP protocol version, represented as a byte string like
+ ``b"1.1"``. See :ref:`the HTTP version normalization rules
+ <http_version-format>` for details.
+
+ .. attribute:: reason
+
+ The reason phrase of this response, as a byte string. For example:
+ ``b"OK"``, or ``b"Not Found"``.
+
+ """
+
+ def _validate(self):
+ if not (100 <= self.status_code < 200):
+ raise LocalProtocolError(
+ "InformationalResponse status_code should be in range "
+ "[100, 200), not {}".format(self.status_code)
+ )
+
+
+class Response(_ResponseBase):
+ """The beginning of an HTTP response.
+
+ Fields:
+
+ .. attribute:: status_code
+
+ The status code of this response, as an integer. For an
+ :class:`Response`, this is always in the range [200,
+ 600).
+
+ .. attribute:: headers
+
+ Request headers, represented as a list of (name, value) pairs. See
+ :ref:`the header normalization rules ` for details.
+
+ .. attribute:: http_version
+
+ The HTTP protocol version, represented as a byte string like
+ ``b"1.1"``. See :ref:`the HTTP version normalization rules
+ ` for details.
+
+ .. attribute:: reason
+
+ The reason phrase of this response, as a byte string. For example:
+ ``b"OK"``, or ``b"Not Found"``.
+
+ """
+
+ def _validate(self):
+ if not (200 <= self.status_code < 600):
+ raise LocalProtocolError(
+ "Response status_code should be in range [200, 600), not {}".format(
+ self.status_code
+ )
+ )
+
+
+class Data(_EventBundle):
+ """Part of an HTTP message body.
+
+ Fields:
+
+ .. attribute:: data
+
+ A :term:`bytes-like object` containing part of a message body. Or, if
+ using the ``combine=False`` argument to :meth:`Connection.send`, then
+ any object that your socket writing code knows what to do with, and for
+ which calling :func:`len` returns the number of bytes that will be
+ written -- see :ref:`sendfile` for details.
+
+ .. attribute:: chunk_start
+
+ A marker that indicates whether this data object is from the start of a
+ chunked transfer encoding chunk. This field is ignored when a Data
+ event is provided to :meth:`Connection.send`: it is only valid on
+ events emitted from :meth:`Connection.next_event`. You probably
+ shouldn't use this attribute at all; see
+ :ref:`chunk-delimiters-are-bad` for details.
+
+ .. attribute:: chunk_end
+
+ A marker that indicates whether this data object is the last for a
+ given chunked transfer encoding chunk. This field is ignored when
+ a Data event is provided to :meth:`Connection.send`: it is only valid
+ on events emitted from :meth:`Connection.next_event`. You probably
+ shouldn't use this attribute at all; see
+ :ref:`chunk-delimiters-are-bad` for details.
+
+ """
+
+ _fields = ["data", "chunk_start", "chunk_end"]
+ _defaults = {"chunk_start": False, "chunk_end": False}
+
+
+# XX FIXME: "A recipient MUST ignore (or consider as an error) any fields that
+# are forbidden to be sent in a trailer, since processing them as if they were
+# present in the header section might bypass external security filters."
+# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#chunked.trailer.part
+# Unfortunately, the list of forbidden fields is long and vague :-/
+class EndOfMessage(_EventBundle):
+ """The end of an HTTP message.
+
+ Fields:
+
+ .. attribute:: headers
+
+ Default value: ``[]``
+
+ Any trailing headers attached to this message, represented as a list of
+ (name, value) pairs. See :ref:`the header normalization rules
+ ` for details.
+
+ Must be empty unless ``Transfer-Encoding: chunked`` is in use.
+
+ """
+
+ _fields = ["headers"]
+ _defaults = {"headers": []}
+
+
+class ConnectionClosed(_EventBundle):
+ """This event indicates that the sender has closed their outgoing
+ connection.
+
+ Note that this does not necessarily mean that they can't *receive* further
+ data, because TCP connections are composed of two one-way channels which
+ can be closed independently. See :ref:`closing` for details.
+
+ No fields.
+ """
+
+ pass
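All of the events above share the declarative `_fields`/`_defaults` pattern inherited from `_EventBundle`. A minimal sketch of that pattern (simplified and hypothetical: the real base class also runs `_validate` and normalizes headers):

```python
class MiniEvent:
    _fields = []
    _defaults = {}

    def __init__(self, **kwargs):
        allowed = set(self._fields)
        for name in kwargs:
            if name not in allowed:
                raise TypeError("unrecognized kwarg {}".format(name))
        missing = (allowed - set(self._defaults)) - set(kwargs)
        if missing:
            raise TypeError("missing required kwargs: {}".format(sorted(missing)))
        # Defaults first, then explicit values override them.
        self.__dict__.update(self._defaults)
        self.__dict__.update(kwargs)

class MiniData(MiniEvent):
    _fields = ["data", "chunk_start", "chunk_end"]
    _defaults = {"chunk_start": False, "chunk_end": False}

d = MiniData(data=b"hello")
# d.chunk_start and d.chunk_end fall back to their declared defaults.
```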
diff --git a/.venv/lib/python3.9/site-packages/h11/_headers.py b/.venv/lib/python3.9/site-packages/h11/_headers.py
new file mode 100644
index 0000000..7ed39bc
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/_headers.py
@@ -0,0 +1,242 @@
+import re
+
+from ._abnf import field_name, field_value
+from ._util import bytesify, LocalProtocolError, validate
+
+# Facts
+# -----
+#
+# Headers are:
+# keys: case-insensitive ascii
+# values: mixture of ascii and raw bytes
+#
+# "Historically, HTTP has allowed field content with text in the ISO-8859-1
+# charset [ISO-8859-1], supporting other charsets only through use of
+# [RFC2047] encoding. In practice, most HTTP header field values use only a
+# subset of the US-ASCII charset [USASCII]. Newly defined header fields SHOULD
+# limit their field values to US-ASCII octets. A recipient SHOULD treat other
+# octets in field content (obs-text) as opaque data."
+# And it deprecates all non-ascii values
+#
+# Leading/trailing whitespace in header names is forbidden
+#
+# Values get leading/trailing whitespace stripped
+#
+# Content-Disposition actually needs to contain unicode semantically; to
+# accomplish this it has a terrifically weird way of encoding the filename
+# itself as ascii (and even this still has lots of cross-browser
+# incompatibilities)
+#
+# Order is important:
+# "a proxy MUST NOT change the order of these field values when forwarding a
+# message"
+# (and there are several headers where the order indicates a preference)
+#
+ # Multiple occurrences of the same header:
+# "A sender MUST NOT generate multiple header fields with the same field name
+# in a message unless either the entire field value for that header field is
+# defined as a comma-separated list [or the header is Set-Cookie which gets a
+# special exception]" - RFC 7230. (cookies are in RFC 6265)
+#
+# So every header aside from Set-Cookie can be merged by b", ".join if it
+# occurs repeatedly. But, of course, they can't necessarily be split by
+# .split(b","), because quoting.
+#
+# Given all this mess (case insensitive, duplicates allowed, order is
+# important, ...), there doesn't appear to be any standard way to handle
+# headers in Python -- they're almost like dicts, but... actually just
+# aren't. For now we punt and just use a super simple representation: headers
+# are a list of pairs
+#
+# [(name1, value1), (name2, value2), ...]
+#
+# where all entries are bytestrings, names are lowercase and have no
+# leading/trailing whitespace, and values are bytestrings with no
+# leading/trailing whitespace. Searching and updating are done via naive O(n)
+# methods.
+#
+# Maybe a dict-of-lists would be better?
+
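The list-of-pairs representation described above makes lookup and deletion naive O(n) scans. A couple of illustrative helpers (not part of h11's API) show what that looks like in practice:

```python
def get_header(headers, name):
    """Collect every value for `name` from a [(name, value), ...] list."""
    return [value for n, value in headers if n == name]

def delete_header(headers, name):
    """Return a new header list with every `name` entry removed."""
    return [(n, v) for n, v in headers if n != name]

headers = [
    (b"host", b"example.com"),
    (b"accept", b"text/html"),
    (b"accept", b"*/*"),
]
# Duplicates are preserved, in order -- order matters for HTTP headers.
assert get_header(headers, b"accept") == [b"text/html", b"*/*"]
```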
+_content_length_re = re.compile(br"[0-9]+")
+_field_name_re = re.compile(field_name.encode("ascii"))
+_field_value_re = re.compile(field_value.encode("ascii"))
+
+
+class Headers:
+ """
+ A list-like interface that allows iterating over headers as byte-pairs
+ of (lowercased-name, value).
+
+ Internally we actually store the representation as three-tuples,
+ including both the raw original casing, in order to preserve casing
+ over-the-wire, and the lowercased name, for case-insensitive comparisons.
+
+ r = Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "example.org"), ("Connection", "keep-alive")],
+ http_version="1.1",
+ )
+ assert r.headers == [
+ (b"host", b"example.org"),
+ (b"connection", b"keep-alive")
+ ]
+ assert r.headers.raw_items() == [
+ (b"Host", b"example.org"),
+ (b"Connection", b"keep-alive")
+ ]
+ """
+
+ __slots__ = "_full_items"
+
+ def __init__(self, full_items):
+ self._full_items = full_items
+
+ def __iter__(self):
+ for _, name, value in self._full_items:
+ yield name, value
+
+ def __bool__(self):
+ return bool(self._full_items)
+
+ def __eq__(self, other):
+ return list(self) == list(other)
+
+ def __len__(self):
+ return len(self._full_items)
+
+ def __repr__(self):
+ return "<Headers(%s)>" % repr(list(self))
+
+ def __getitem__(self, idx):
+ _, name, value = self._full_items[idx]
+ return (name, value)
+
+ def raw_items(self):
+ return [(raw_name, value) for raw_name, _, value in self._full_items]
+
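The three-tuple trick the docstring describes, preserving raw casing while comparing case-insensitively, can be stripped down to a few lines (a sketch; the real class also implements `__eq__`, `__len__`, `__getitem__`, and so on):

```python
class MiniHeaders:
    def __init__(self, pairs):
        # Store (raw_name, lowered_name, value) so we can both compare
        # case-insensitively and echo the original casing on the wire.
        self._full_items = [(raw, raw.lower(), value) for raw, value in pairs]

    def __iter__(self):
        for _, name, value in self._full_items:
            yield name, value

    def raw_items(self):
        return [(raw, value) for raw, _, value in self._full_items]

h = MiniHeaders([(b"Host", b"example.org"), (b"Connection", b"keep-alive")])
```

Iterating yields lowercased names for comparisons, while `raw_items()` recovers the casing the caller originally supplied.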
+
+def normalize_and_validate(headers, _parsed=False):
+ new_headers = []
+ seen_content_length = None
+ saw_transfer_encoding = False
+ for name, value in headers:
+ # For headers coming out of the parser, we can safely skip some steps,
+ # because it always returns bytes and has already run these regexes
+ # over the data:
+ if not _parsed:
+ name = bytesify(name)
+ value = bytesify(value)
+ validate(_field_name_re, name, "Illegal header name {!r}", name)
+ validate(_field_value_re, value, "Illegal header value {!r}", value)
+ raw_name = name
+ name = name.lower()
+ if name == b"content-length":
+ lengths = {length.strip() for length in value.split(b",")}
+ if len(lengths) != 1:
+ raise LocalProtocolError("conflicting Content-Length headers")
+ value = lengths.pop()
+ validate(_content_length_re, value, "bad Content-Length")
+ if seen_content_length is None:
+ seen_content_length = value
+ new_headers.append((raw_name, name, value))
+ elif seen_content_length != value:
+ raise LocalProtocolError("conflicting Content-Length headers")
+ elif name == b"transfer-encoding":
+ # "A server that receives a request message with a transfer coding
+ # it does not understand SHOULD respond with 501 (Not
+ # Implemented)."
+ # https://tools.ietf.org/html/rfc7230#section-3.3.1
+ if saw_transfer_encoding:
+ raise LocalProtocolError(
+ "multiple Transfer-Encoding headers", error_status_hint=501
+ )
+ # "All transfer-coding names are case-insensitive"
+ # -- https://tools.ietf.org/html/rfc7230#section-4
+ value = value.lower()
+ if value != b"chunked":
+ raise LocalProtocolError(
+ "Only Transfer-Encoding: chunked is supported",
+ error_status_hint=501,
+ )
+ saw_transfer_encoding = True
+ new_headers.append((raw_name, name, value))
+ else:
+ new_headers.append((raw_name, name, value))
+ return Headers(new_headers)
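The Content-Length handling above collapses duplicate values: repeated equal values are tolerated, conflicting ones are rejected. A standalone sketch of just that rule (illustrative names, `ValueError` in place of `LocalProtocolError`):

```python
def merge_content_length(values):
    """Collapse repeated Content-Length values; reject conflicts.

    `values` are raw byte values, each possibly itself comma-separated.
    """
    lengths = set()
    for value in values:
        lengths |= {piece.strip() for piece in value.split(b",")}
    if len(lengths) != 1:
        raise ValueError("conflicting Content-Length headers")
    value = lengths.pop()
    if not value.isdigit():
        raise ValueError("bad Content-Length")
    return value

# Duplicates that agree are fine; a comma-joined repeat also collapses.
assert merge_content_length([b"100", b"100, 100"]) == b"100"
```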
+
+
+def get_comma_header(headers, name):
+ # Should only be used for headers whose value is a list of
+ # comma-separated, case-insensitive values.
+ #
+ # The header name `name` is expected to be lower-case bytes.
+ #
+ # Connection: meets these criteria (including case insensitivity).
+ #
+ # Content-Length: technically is just a single value (1*DIGIT), but the
+ # standard makes reference to implementations that do multiple values, and
+ # using this doesn't hurt. Ditto, case insensitivity doesn't hurt either
+ # way.
+ #
+ # Transfer-Encoding: is more complex (allows for quoted strings), so
+ # splitting on , is actually wrong. For example, this is legal:
+ #
+ # Transfer-Encoding: foo; options="1,2", chunked
+ #
+ # and should be parsed as
+ #
+ # foo; options="1,2"
+ # chunked
+ #
+ # but this naive function will parse it as
+ #
+ # foo; options="1
+ # 2"
+ # chunked
+ #
+ # However, this is okay because the only thing we are going to do with
+ # any Transfer-Encoding is reject ones that aren't just "chunked", so
+ # both of these will be treated the same anyway.
+ #
+ # Expect: the only legal value is the literal string
+ # "100-continue". Splitting on commas is harmless. Case insensitive.
+ #
+ out = []
+ for _, found_name, found_raw_value in headers._full_items:
+ if found_name == name:
+ found_raw_value = found_raw_value.lower()
+ for found_split_value in found_raw_value.split(b","):
+ found_split_value = found_split_value.strip()
+ if found_split_value:
+ out.append(found_split_value)
+ return out
+
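The splitting behavior documented above, including the known limitation with quoted strings, can be demonstrated standalone (an illustrative helper, not h11's API):

```python
def split_comma_values(raw_value):
    """Lowercase, split on commas, strip whitespace, drop empty pieces."""
    return [p.strip() for p in raw_value.lower().split(b",") if p.strip()]

# Fine for Connection-style headers:
assert split_comma_values(b"Keep-Alive, Upgrade") == [b"keep-alive", b"upgrade"]

# Naive for quoted strings, exactly as the comment warns -- the quoted
# comma gets split too:
assert split_comma_values(b'foo; options="1,2", chunked') == [
    b'foo; options="1', b'2"', b"chunked"]
```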
+
+def set_comma_header(headers, name, new_values):
+ # The header name `name` is expected to be lower-case bytes.
+ #
+ # Note that when we store the header we use title casing for the header
+ # names, in order to match the conventional HTTP header style.
+ #
+ # Simply calling `.title()` is a blunt approach, but it's correct
+ # here given the cases where we're using `set_comma_header`...
+ #
+ # Connection, Content-Length, Transfer-Encoding.
+ new_headers = []
+ for found_raw_name, found_name, found_raw_value in headers._full_items:
+ if found_name != name:
+ new_headers.append((found_raw_name, found_raw_value))
+ for new_value in new_values:
+ new_headers.append((name.title(), new_value))
+ return normalize_and_validate(new_headers)
+
+
+def has_expect_100_continue(request):
+ # https://tools.ietf.org/html/rfc7231#section-5.1.1
+ # "A server that receives a 100-continue expectation in an HTTP/1.0 request
+ # MUST ignore that expectation."
+ if request.http_version < b"1.1":
+ return False
+ expect = get_comma_header(request.headers, b"expect")
+ return b"100-continue" in expect
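`has_expect_100_continue` leans on byte strings comparing lexicographically, so `b"1.0" < b"1.1"` holds for HTTP version strings. A sketch of the same check with hypothetical names:

```python
def expects_100_continue(http_version, expect_values):
    """Mirror the RFC 7231 section 5.1.1 rule: a 100-continue expectation
    in an HTTP/1.0 request must be ignored."""
    if http_version < b"1.1":   # bytes compare lexicographically
        return False
    return b"100-continue" in expect_values

assert expects_100_continue(b"1.1", [b"100-continue"]) is True
assert expects_100_continue(b"1.0", [b"100-continue"]) is False
```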
diff --git a/.venv/lib/python3.9/site-packages/h11/_readers.py b/.venv/lib/python3.9/site-packages/h11/_readers.py
new file mode 100644
index 0000000..0ead0be
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/_readers.py
@@ -0,0 +1,222 @@
+# Code to read HTTP data
+#
+# Strategy: each reader is a callable which takes a ReceiveBuffer object, and
+# either:
+# 1) consumes some of it and returns an Event
+# 2) raises a LocalProtocolError (for consistency -- e.g. we call validate()
+# and it might raise a LocalProtocolError, so simpler just to always use
+# this)
+# 3) returns None, meaning "I need more data"
+#
+# If they have a .read_eof attribute, then this will be called if an EOF is
+# received -- but this is optional. Either way, the actual ConnectionClosed
+# event will be generated afterwards.
+#
+# READERS is a dict describing how to pick a reader. It maps states to either:
+# - a reader
+# - or, for body readers, a dict of per-framing reader factories
+
+import re
+
+from ._abnf import chunk_header, header_field, request_line, status_line
+from ._events import *
+from ._state import *
+from ._util import LocalProtocolError, RemoteProtocolError, validate
+
+__all__ = ["READERS"]
+
+header_field_re = re.compile(header_field.encode("ascii"))
+
+# Remember that this has to run in O(n) time -- so e.g. the bytearray cast is
+# critical.
+obs_fold_re = re.compile(br"[ \t]+")
+
+
+def _obsolete_line_fold(lines):
+ it = iter(lines)
+ last = None
+ for line in it:
+ match = obs_fold_re.match(line)
+ if match:
+ if last is None:
+ raise LocalProtocolError("continuation line at start of headers")
+ if not isinstance(last, bytearray):
+ last = bytearray(last)
+ last += b" "
+ last += line[match.end() :]
+ else:
+ if last is not None:
+ yield last
+ last = line
+ if last is not None:
+ yield last
+
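The obs-fold handling above joins a continuation line (one starting with space or tab) onto the previous header line with a single space. A simplified non-generator sketch of the same idea:

```python
import re

_fold_re = re.compile(rb"[ \t]+")

def fold_lines(lines):
    """Join obsolete-line-fold continuation lines onto their predecessor."""
    out = []
    for line in lines:
        match = _fold_re.match(line)
        if match:
            if not out:
                raise ValueError("continuation line at start of headers")
            out[-1] = out[-1] + b" " + line[match.end():]
        else:
            out.append(line)
    return out

assert fold_lines([b"X-Long: part one,", b"\t part two"]) == [
    b"X-Long: part one, part two"]
```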
+
+def _decode_header_lines(lines):
+ for line in _obsolete_line_fold(lines):
+ matches = validate(header_field_re, line, "illegal header line: {!r}", line)
+ yield (matches["field_name"], matches["field_value"])
+
+
+request_line_re = re.compile(request_line.encode("ascii"))
+
+
+def maybe_read_from_IDLE_client(buf):
+ lines = buf.maybe_extract_lines()
+ if lines is None:
+ if buf.is_next_line_obviously_invalid_request_line():
+ raise LocalProtocolError("illegal request line")
+ return None
+ if not lines:
+ raise LocalProtocolError("no request line received")
+ matches = validate(
+ request_line_re, lines[0], "illegal request line: {!r}", lines[0]
+ )
+ return Request(
+ headers=list(_decode_header_lines(lines[1:])), _parsed=True, **matches
+ )
+
+
+status_line_re = re.compile(status_line.encode("ascii"))
+
+
+def maybe_read_from_SEND_RESPONSE_server(buf):
+ lines = buf.maybe_extract_lines()
+ if lines is None:
+ if buf.is_next_line_obviously_invalid_request_line():
+ raise LocalProtocolError("illegal request line")
+ return None
+ if not lines:
+ raise LocalProtocolError("no response line received")
+ matches = validate(status_line_re, lines[0], "illegal status line: {!r}", lines[0])
+ # Tolerate missing reason phrases
+ if matches["reason"] is None:
+ matches["reason"] = b""
+ status_code = matches["status_code"] = int(matches["status_code"])
+ class_ = InformationalResponse if status_code < 200 else Response
+ return class_(
+ headers=list(_decode_header_lines(lines[1:])), _parsed=True, **matches
+ )
+
+
+class ContentLengthReader:
+ def __init__(self, length):
+ self._length = length
+ self._remaining = length
+
+ def __call__(self, buf):
+ if self._remaining == 0:
+ return EndOfMessage()
+ data = buf.maybe_extract_at_most(self._remaining)
+ if data is None:
+ return None
+ self._remaining -= len(data)
+ return Data(data=data)
+
+ def read_eof(self):
+ raise RemoteProtocolError(
+ "peer closed connection without sending complete message body "
+ "(received {} bytes, expected {})".format(
+ self._length - self._remaining, self._length
+ )
+ )
+
+
+chunk_header_re = re.compile(chunk_header.encode("ascii"))
+
+
+class ChunkedReader:
+ def __init__(self):
+ self._bytes_in_chunk = 0
+ # After reading a chunk, we have to throw away the trailing \r\n; if
+ # this is >0 then we discard that many bytes before resuming regular
+ # de-chunkification.
+ self._bytes_to_discard = 0
+ self._reading_trailer = False
+
+ def __call__(self, buf):
+ if self._reading_trailer:
+ lines = buf.maybe_extract_lines()
+ if lines is None:
+ return None
+ return EndOfMessage(headers=list(_decode_header_lines(lines)))
+ if self._bytes_to_discard > 0:
+ data = buf.maybe_extract_at_most(self._bytes_to_discard)
+ if data is None:
+ return None
+ self._bytes_to_discard -= len(data)
+ if self._bytes_to_discard > 0:
+ return None
+ # else, fall through and read some more
+ assert self._bytes_to_discard == 0
+ if self._bytes_in_chunk == 0:
+ # We need to refill our chunk count
+ chunk_header = buf.maybe_extract_next_line()
+ if chunk_header is None:
+ return None
+ matches = validate(
+ chunk_header_re,
+ chunk_header,
+ "illegal chunk header: {!r}",
+ chunk_header,
+ )
+ # XX FIXME: we discard chunk extensions. Does anyone care?
+ self._bytes_in_chunk = int(matches["chunk_size"], base=16)
+ if self._bytes_in_chunk == 0:
+ self._reading_trailer = True
+ return self(buf)
+ chunk_start = True
+ else:
+ chunk_start = False
+ assert self._bytes_in_chunk > 0
+ data = buf.maybe_extract_at_most(self._bytes_in_chunk)
+ if data is None:
+ return None
+ self._bytes_in_chunk -= len(data)
+ if self._bytes_in_chunk == 0:
+ self._bytes_to_discard = 2
+ chunk_end = True
+ else:
+ chunk_end = False
+ return Data(data=data, chunk_start=chunk_start, chunk_end=chunk_end)
+
+ def read_eof(self):
+ raise RemoteProtocolError(
+ "peer closed connection without sending complete message body "
+ "(incomplete chunked read)"
+ )
+
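The chunk framing that `ChunkedReader` consumes incrementally (hex size line, data, trailing `\r\n`, terminated by a zero-size chunk) can be illustrated on a complete buffer. This is a simplified, non-incremental sketch that ignores chunk extensions and trailers:

```python
def dechunk(body):
    """Decode a complete Transfer-Encoding: chunked body."""
    out = b""
    pos = 0
    while True:
        eol = body.index(b"\r\n", pos)
        size = int(body[pos:eol], 16)   # chunk-size line is hexadecimal
        pos = eol + 2
        if size == 0:
            break                        # last-chunk; trailers would follow
        out += body[pos:pos + size]
        pos += size + 2                  # skip chunk data plus trailing \r\n
    return out

assert dechunk(b"4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n") == b"Wikipedia"
```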
+
+class Http10Reader:
+ def __call__(self, buf):
+ data = buf.maybe_extract_at_most(999999999)
+ if data is None:
+ return None
+ return Data(data=data)
+
+ def read_eof(self):
+ return EndOfMessage()
+
+
+def expect_nothing(buf):
+ if buf:
+ raise LocalProtocolError("Got data when expecting EOF")
+ return None
+
+
+READERS = {
+ (CLIENT, IDLE): maybe_read_from_IDLE_client,
+ (SERVER, IDLE): maybe_read_from_SEND_RESPONSE_server,
+ (SERVER, SEND_RESPONSE): maybe_read_from_SEND_RESPONSE_server,
+ (CLIENT, DONE): expect_nothing,
+ (CLIENT, MUST_CLOSE): expect_nothing,
+ (CLIENT, CLOSED): expect_nothing,
+ (SERVER, DONE): expect_nothing,
+ (SERVER, MUST_CLOSE): expect_nothing,
+ (SERVER, CLOSED): expect_nothing,
+ SEND_BODY: {
+ "chunked": ChunkedReader,
+ "content-length": ContentLengthReader,
+ "http/1.0": Http10Reader,
+ },
+}
diff --git a/.venv/lib/python3.9/site-packages/h11/_receivebuffer.py b/.venv/lib/python3.9/site-packages/h11/_receivebuffer.py
new file mode 100644
index 0000000..a3737f3
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/_receivebuffer.py
@@ -0,0 +1,152 @@
+import re
+import sys
+
+__all__ = ["ReceiveBuffer"]
+
+
+# Operations we want to support:
+# - find next \r\n or \r\n\r\n (\n or \n\n are also acceptable),
+# or wait until there is one
+# - read at-most-N bytes
+# Goals:
+# - on average, do this fast
+# - worst case, do this in O(n) where n is the number of bytes processed
+# Plan:
+# - store bytearray, offset, how far we've searched for a separator token
+# - use the how-far-we've-searched data to avoid rescanning
+# - while doing a stream of uninterrupted processing, advance offset instead
+# of constantly copying
+# WARNING:
+# - I haven't benchmarked or profiled any of this yet.
+#
+# Note that starting in Python 3.4, deleting the initial n bytes from a
+# bytearray is amortized O(n), thanks to some excellent work by Antoine
+# Martin:
+#
+# https://bugs.python.org/issue19087
+#
+# This means that if we only supported 3.4+, we could get rid of the code here
+# involving self._start and self.compress, because it's doing exactly the same
+# thing that bytearray now does internally.
+#
+# BUT unfortunately, we still support 2.7, and reading short segments out of a
+# long buffer MUST be O(bytes read) to avoid DoS issues, so we can't actually
+# delete this code. Yet:
+#
+# https://pythonclock.org/
+#
+# (Two things to double-check first though: make sure PyPy also has the
+# optimization, and benchmark to make sure it's a win, since we do have a
+# slightly clever thing where we delay calling compress() until we've
+# processed a whole event, which could in theory be slightly more efficient
+# than the internal bytearray support.)
+blank_line_regex = re.compile(b"\n\r?\n", re.MULTILINE)
+
+
+class ReceiveBuffer:
+ def __init__(self):
+ self._data = bytearray()
+ self._next_line_search = 0
+ self._multiple_lines_search = 0
+
+ def __iadd__(self, byteslike):
+ self._data += byteslike
+ return self
+
+ def __bool__(self):
+ return bool(len(self))
+
+ def __len__(self):
+ return len(self._data)
+
+ # for @property unprocessed_data
+ def __bytes__(self):
+ return bytes(self._data)
+
+ def _extract(self, count):
+ # Extract an initial slice of the data buffer and return it.
+ out = self._data[:count]
+ del self._data[:count]
+
+ self._next_line_search = 0
+ self._multiple_lines_search = 0
+
+ return out
+
+ def maybe_extract_at_most(self, count):
+ """
+ Extract a fixed number of bytes from the buffer.
+ """
+ out = self._data[:count]
+ if not out:
+ return None
+
+ return self._extract(count)
+
+ def maybe_extract_next_line(self):
+ """
+ Extract the first line, if it is completed in the buffer.
+ """
+ # Only search in buffer space that we've not already looked at.
+ search_start_index = max(0, self._next_line_search - 1)
+ partial_idx = self._data.find(b"\r\n", search_start_index)
+
+ if partial_idx == -1:
+ self._next_line_search = len(self._data)
+ return None
+
+ # + 2 is to compensate len(b"\r\n")
+ idx = partial_idx + 2
+
+ return self._extract(idx)
+
+ def maybe_extract_lines(self):
+ """
+ Extract everything up to the first blank line, and return a list of lines.
+ """
+ # Handle the case where we have an immediate empty line.
+ if self._data[:1] == b"\n":
+ self._extract(1)
+ return []
+
+ if self._data[:2] == b"\r\n":
+ self._extract(2)
+ return []
+
+ # Only search in buffer space that we've not already looked at.
+ match = blank_line_regex.search(self._data, self._multiple_lines_search)
+ if match is None:
+ self._multiple_lines_search = max(0, len(self._data) - 2)
+ return None
+
+ # Truncate the buffer and return it.
+ idx = match.span(0)[-1]
+ out = self._extract(idx)
+ lines = out.split(b"\n")
+
+ for line in lines:
+ if line.endswith(b"\r"):
+ del line[-1]
+
+ assert lines[-2] == lines[-1] == b""
+
+ del lines[-2:]
+
+ return lines
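The blank-line scan above can be demonstrated with the same regex on a complete buffer (a simplified, non-incremental sketch that skips the incremental-search bookkeeping):

```python
import re

blank_line_regex = re.compile(b"\n\r?\n", re.MULTILINE)

def split_header_block(data):
    """Split everything up to the first blank line into a list of lines,
    or return None if no blank line has arrived yet."""
    match = blank_line_regex.search(data)
    if match is None:
        return None                      # need more data
    block = data[:match.span(0)[-1]]
    lines = [line.rstrip(b"\r") for line in block.split(b"\n")]
    # The blank line leaves two empty entries at the end; drop them.
    assert lines[-2] == lines[-1] == b""
    return lines[:-2]

assert split_header_block(b"GET / HTTP/1.1\r\nHost: x\r\n\r\nBODY") == [
    b"GET / HTTP/1.1", b"Host: x"]
```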
+
+ # In theory we should wait until `\r\n` before starting to validate
+ # incoming data. However it's useful to detect (very) invalid data
+ # early, since it might not even contain `\r\n` at all (in which case
+ # only a timeout would get rid of it).
+ # This is not a 100% effective detection but more of a cheap sanity check
+ # allowing for early abort in some useful cases.
+ # This is especially useful when the peer sends us a TLS stream where we
+ # were expecting plain HTTP, given that all versions of TLS so far start
+ # the handshake with a 0x16 message type code.
+ def is_next_line_obviously_invalid_request_line(self):
+ try:
+ # HTTP header line must not contain non-printable characters
+ # and should not start with a space
+ return self._data[0] < 0x21
+ except IndexError:
+ return False
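The early-abort heuristic above can be shown in isolation: any first byte below 0x21 (control characters and space) cannot start a valid request line, and a TLS ClientHello begins with the 0x16 handshake record type:

```python
def obviously_invalid_first_byte(data):
    """True if the buffered data can't possibly start a request line."""
    if not data:
        return False           # nothing buffered yet; can't tell
    return data[0] < 0x21      # control chars / space never start a method

assert obviously_invalid_first_byte(b"\x16\x03\x01")       # TLS handshake
assert not obviously_invalid_first_byte(b"GET / HTTP/1.1\r\n")
assert not obviously_invalid_first_byte(b"")
```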
diff --git a/.venv/lib/python3.9/site-packages/h11/_state.py b/.venv/lib/python3.9/site-packages/h11/_state.py
new file mode 100644
index 0000000..0f08a09
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/_state.py
@@ -0,0 +1,307 @@
+################################################################
+# The core state machine
+################################################################
+#
+# Rule 1: everything that affects the state machine and state transitions must
+# live here in this file. As much as possible goes into the table-based
+# representation, but for the bits that don't quite fit, the actual code and
+# state must nonetheless live here.
+#
+# Rule 2: this file does not know about what role we're playing; it only knows
+# about HTTP request/response cycles in the abstract. This ensures that we
+# don't cheat and apply different rules to local and remote parties.
+#
+#
+# Theory of operation
+# ===================
+#
+# Possibly the simplest way to think about this is that we actually have 5
+# different state machines here. Yes, 5. These are:
+#
+# 1) The client state, with its complicated automaton (see the docs)
+# 2) The server state, with its complicated automaton (see the docs)
+# 3) The keep-alive state, with possible states {True, False}
+# 4) The SWITCH_CONNECT state, with possible states {False, True}
+# 5) The SWITCH_UPGRADE state, with possible states {False, True}
+#
+# For (3)-(5), the first state listed is the initial state.
+#
+# (1)-(3) are stored explicitly in member variables. The last
+# two are stored implicitly in the pending_switch_proposals set as:
+# (state of 4) == (_SWITCH_CONNECT in pending_switch_proposals)
+# (state of 5) == (_SWITCH_UPGRADE in pending_switch_proposals)
+#
+# And each of these machines has two different kinds of transitions:
+#
+# a) Event-triggered
+# b) State-triggered
+#
+# Event triggered is the obvious thing that you'd think it is: some event
+# happens, and if it's the right event at the right time then a transition
+# happens. But there are somewhat complicated rules for which machines can
+# "see" which events. (As a rule of thumb, if a machine "sees" an event, this
+# means two things: the event can affect the machine, and if the machine is
+# not in a state where it expects that event then it's an error.) These rules
+# are:
+#
+# 1) The client machine sees all h11.events objects emitted by the client.
+#
+# 2) The server machine sees all h11.events objects emitted by the server.
+#
+# It also sees the client's Request event.
+#
+# And sometimes, server events are annotated with a _SWITCH_* event. For
+# example, we can have a (Response, _SWITCH_CONNECT) event, which is
+# different from a regular Response event.
+#
+# 3) The keep-alive machine sees the process_keep_alive_disabled() event
+# (which is derived from Request/Response events), and this event
+# transitions it from True -> False, or from False -> False. There's no way
+# to transition back.
+#
+# 4&5) The _SWITCH_* machines transition from False->True when we get a
+# Request that proposes the relevant type of switch (via
+# process_client_switch_proposals), and they go from True->False when we
+# get a Response that has no _SWITCH_* annotation.
+#
+# So that's event-triggered transitions.
+#
+# State-triggered transitions are less standard. What they do here is couple
+# the machines together. The way this works is, when certain *joint*
+# configurations of states are achieved, then we automatically transition to a
+# new *joint* state. So, for example, if we're ever in a joint state with
+#
+# client: DONE
+# keep-alive: False
+#
+# then the client state immediately transitions to:
+#
+# client: MUST_CLOSE
+#
+# This is fundamentally different from an event-based transition, because it
+# doesn't matter how we arrived at the {client: DONE, keep-alive: False} state
+# -- maybe the client transitioned SEND_BODY -> DONE, or keep-alive
+# transitioned True -> False. Either way, once this precondition is satisfied,
+# this transition is immediately triggered.
+#
+# What if two conflicting state-based transitions get enabled at the same
+# time? In practice there's only one case where this arises (client DONE ->
+# MIGHT_SWITCH_PROTOCOL versus DONE -> MUST_CLOSE), and we resolve it by
+# explicitly prioritizing the DONE -> MIGHT_SWITCH_PROTOCOL transition.
+#
+# Implementation
+# --------------
+#
+# The event-triggered transitions for the server and client machines are all
+# stored explicitly in a table. Ditto for the state-triggered transitions that
+# involve just the server and client state.
+#
+# The transitions for the other machines, and the state-triggered transitions
+# that involve the other machines, are written out as explicit Python code.
+#
+# It'd be nice if there were some cleaner way to do all this. This isn't
+# *too* terrible, but I feel like it could probably be better.
+#
+# WARNING
+# -------
+#
+# The script that generates the state machine diagrams for the docs knows how
+# to read out the EVENT_TRIGGERED_TRANSITIONS and STATE_TRIGGERED_TRANSITIONS
+# tables. But it can't automatically read the transitions that are written
+# directly in Python code. So if you touch those, you need to also update the
+# script to keep it in sync!
+
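The fixed-point loop described above, applying joint-state rules repeatedly until nothing changes, can be sketched with a toy two-machine example. The state names and rule here are hypothetical; only the shape of the loop mirrors `_fire_state_triggered_transitions`:

```python
def run_to_fixed_point(states, rules):
    """Repeatedly apply joint-state rules until the states stop changing.

    `rules` maps a (client_state, server_state) pair to the updates it
    forces, mirroring the STATE_TRIGGERED_TRANSITIONS table.
    """
    while True:
        before = dict(states)
        updates = rules.get((states["client"], states["server"]), {})
        states.update(updates)
        if states == before:
            return states

rules = {("done", "closed"): {"client": "must-close"}}
states = run_to_fixed_point({"client": "done", "server": "closed"}, rules)
assert states == {"client": "must-close", "server": "closed"}
```

Note that it doesn't matter how the joint state `("done", "closed")` was reached; the moment the precondition holds, the transition fires.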
+from ._events import *
+from ._util import LocalProtocolError, make_sentinel
+
+# Everything in __all__ gets re-exported as part of the h11 public API.
+__all__ = [
+ "CLIENT",
+ "SERVER",
+ "IDLE",
+ "SEND_RESPONSE",
+ "SEND_BODY",
+ "DONE",
+ "MUST_CLOSE",
+ "CLOSED",
+ "MIGHT_SWITCH_PROTOCOL",
+ "SWITCHED_PROTOCOL",
+ "ERROR",
+]
+
+CLIENT = make_sentinel("CLIENT")
+SERVER = make_sentinel("SERVER")
+
+# States
+IDLE = make_sentinel("IDLE")
+SEND_RESPONSE = make_sentinel("SEND_RESPONSE")
+SEND_BODY = make_sentinel("SEND_BODY")
+DONE = make_sentinel("DONE")
+MUST_CLOSE = make_sentinel("MUST_CLOSE")
+CLOSED = make_sentinel("CLOSED")
+ERROR = make_sentinel("ERROR")
+
+# Switch types
+MIGHT_SWITCH_PROTOCOL = make_sentinel("MIGHT_SWITCH_PROTOCOL")
+SWITCHED_PROTOCOL = make_sentinel("SWITCHED_PROTOCOL")
+
+_SWITCH_UPGRADE = make_sentinel("_SWITCH_UPGRADE")
+_SWITCH_CONNECT = make_sentinel("_SWITCH_CONNECT")
+
+EVENT_TRIGGERED_TRANSITIONS = {
+ CLIENT: {
+ IDLE: {Request: SEND_BODY, ConnectionClosed: CLOSED},
+ SEND_BODY: {Data: SEND_BODY, EndOfMessage: DONE},
+ DONE: {ConnectionClosed: CLOSED},
+ MUST_CLOSE: {ConnectionClosed: CLOSED},
+ CLOSED: {ConnectionClosed: CLOSED},
+ MIGHT_SWITCH_PROTOCOL: {},
+ SWITCHED_PROTOCOL: {},
+ ERROR: {},
+ },
+ SERVER: {
+ IDLE: {
+ ConnectionClosed: CLOSED,
+ Response: SEND_BODY,
+ # Special case: server sees client Request events, in this form
+ (Request, CLIENT): SEND_RESPONSE,
+ },
+ SEND_RESPONSE: {
+ InformationalResponse: SEND_RESPONSE,
+ Response: SEND_BODY,
+ (InformationalResponse, _SWITCH_UPGRADE): SWITCHED_PROTOCOL,
+ (Response, _SWITCH_CONNECT): SWITCHED_PROTOCOL,
+ },
+ SEND_BODY: {Data: SEND_BODY, EndOfMessage: DONE},
+ DONE: {ConnectionClosed: CLOSED},
+ MUST_CLOSE: {ConnectionClosed: CLOSED},
+ CLOSED: {ConnectionClosed: CLOSED},
+ SWITCHED_PROTOCOL: {},
+ ERROR: {},
+ },
+}
+
+# NB: there are also some special-case state-triggered transitions hard-coded
+# into _fire_state_triggered_transitions below.
+STATE_TRIGGERED_TRANSITIONS = {
+ # (Client state, Server state) -> new states
+ # Protocol negotiation
+ (MIGHT_SWITCH_PROTOCOL, SWITCHED_PROTOCOL): {CLIENT: SWITCHED_PROTOCOL},
+ # Socket shutdown
+ (CLOSED, DONE): {SERVER: MUST_CLOSE},
+ (CLOSED, IDLE): {SERVER: MUST_CLOSE},
+ (ERROR, DONE): {SERVER: MUST_CLOSE},
+ (DONE, CLOSED): {CLIENT: MUST_CLOSE},
+ (IDLE, CLOSED): {CLIENT: MUST_CLOSE},
+ (DONE, ERROR): {CLIENT: MUST_CLOSE},
+}
+
+
+class ConnectionState:
+ def __init__(self):
+ # Extra bits of state that don't quite fit into the state model.
+
+ # If this is False then it enables the automatic DONE -> MUST_CLOSE
+ # transition. Don't set this directly; call .keep_alive_disabled()
+ self.keep_alive = True
+
+ # This is a subset of {UPGRADE, CONNECT}, containing the proposals
+ # made by the client for switching protocols.
+ self.pending_switch_proposals = set()
+
+ self.states = {CLIENT: IDLE, SERVER: IDLE}
+
+ def process_error(self, role):
+ self.states[role] = ERROR
+ self._fire_state_triggered_transitions()
+
+ def process_keep_alive_disabled(self):
+ self.keep_alive = False
+ self._fire_state_triggered_transitions()
+
+ def process_client_switch_proposal(self, switch_event):
+ self.pending_switch_proposals.add(switch_event)
+ self._fire_state_triggered_transitions()
+
+ def process_event(self, role, event_type, server_switch_event=None):
+ if server_switch_event is not None:
+ assert role is SERVER
+ if server_switch_event not in self.pending_switch_proposals:
+ raise LocalProtocolError(
+ "Received server {} event without a pending proposal".format(
+ server_switch_event
+ )
+ )
+ event_type = (event_type, server_switch_event)
+ if server_switch_event is None and event_type is Response:
+ self.pending_switch_proposals = set()
+ self._fire_event_triggered_transitions(role, event_type)
+ # Special case: the server state does get to see Request
+ # events.
+ if event_type is Request:
+ assert role is CLIENT
+ self._fire_event_triggered_transitions(SERVER, (Request, CLIENT))
+ self._fire_state_triggered_transitions()
+
+ def _fire_event_triggered_transitions(self, role, event_type):
+ state = self.states[role]
+ try:
+ new_state = EVENT_TRIGGERED_TRANSITIONS[role][state][event_type]
+ except KeyError:
+ raise LocalProtocolError(
+ "can't handle event type {} when role={} and state={}".format(
+ event_type.__name__, role, self.states[role]
+ )
+ )
+ self.states[role] = new_state
+
+ def _fire_state_triggered_transitions(self):
+ # We apply these rules repeatedly until converging on a fixed point
+ while True:
+ start_states = dict(self.states)
+
+ # It could happen that both these special-case transitions are
+ # enabled at the same time:
+ #
+ # DONE -> MIGHT_SWITCH_PROTOCOL
+ # DONE -> MUST_CLOSE
+ #
+ # For example, this will always be true of a HTTP/1.0 client
+ # requesting CONNECT. If this happens, the protocol switch takes
+ # priority. From there the client will either go to
+ # SWITCHED_PROTOCOL, in which case it's none of our business when
+ # they close the connection, or else the server will deny the
+ # request, in which case the client will go back to DONE and then
+ # from there to MUST_CLOSE.
+ if self.pending_switch_proposals:
+ if self.states[CLIENT] is DONE:
+ self.states[CLIENT] = MIGHT_SWITCH_PROTOCOL
+
+ if not self.pending_switch_proposals:
+ if self.states[CLIENT] is MIGHT_SWITCH_PROTOCOL:
+ self.states[CLIENT] = DONE
+
+ if not self.keep_alive:
+ for role in (CLIENT, SERVER):
+ if self.states[role] is DONE:
+ self.states[role] = MUST_CLOSE
+
+ # Tabular state-triggered transitions
+ joint_state = (self.states[CLIENT], self.states[SERVER])
+ changes = STATE_TRIGGERED_TRANSITIONS.get(joint_state, {})
+ self.states.update(changes)
+
+ if self.states == start_states:
+ # Fixed point reached
+ return
+
+ def start_next_cycle(self):
+ if self.states != {CLIENT: DONE, SERVER: DONE}:
+ raise LocalProtocolError(
+ "not in a reusable state. self.states={}".format(self.states)
+ )
+ # Can't reach DONE/DONE with any of these active, but still, let's be
+ # sure.
+ assert self.keep_alive
+ assert not self.pending_switch_proposals
+ self.states = {CLIENT: IDLE, SERVER: IDLE}
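The `_fire_state_triggered_transitions` method above is a fixed-point iteration over a lookup table keyed on the joint (client, server) state. A toy standalone version (plain strings instead of h11's sentinel classes, and a made-up two-entry table) shows the shape of the loop:

```python
# Minimal sketch of the fixed-point loop in _fire_state_triggered_transitions.
# States are strings here; the rule table below is illustrative, not h11's.
RULES = {
    # (client state, server state) -> forced changes
    ("DONE", "CLOSED"): {"client": "MUST_CLOSE"},
}


def fire_state_triggered(states):
    # Apply the rules repeatedly until no rule changes anything.
    while True:
        start = dict(states)
        joint = (states["client"], states["server"])
        states.update(RULES.get(joint, {}))
        if states == start:
            return states  # fixed point reached


s = fire_state_triggered({"client": "DONE", "server": "CLOSED"})
assert s == {"client": "MUST_CLOSE", "server": "CLOSED"}
```

The loop terminates because every rule moves a role strictly "forward" in the state graph, so the joint state can only change a bounded number of times.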
diff --git a/.venv/lib/python3.9/site-packages/h11/_util.py b/.venv/lib/python3.9/site-packages/h11/_util.py
new file mode 100644
index 0000000..eb1a5cd
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/_util.py
@@ -0,0 +1,122 @@
+__all__ = [
+ "ProtocolError",
+ "LocalProtocolError",
+ "RemoteProtocolError",
+ "validate",
+ "make_sentinel",
+ "bytesify",
+]
+
+
+class ProtocolError(Exception):
+ """Exception indicating a violation of the HTTP/1.1 protocol.
+
+    This is an abstract base class, with two concrete subclasses:
+ :exc:`LocalProtocolError`, which indicates that you tried to do something
+ that HTTP/1.1 says is illegal, and :exc:`RemoteProtocolError`, which
+ indicates that the remote peer tried to do something that HTTP/1.1 says is
+ illegal. See :ref:`error-handling` for details.
+
+ In addition to the normal :exc:`Exception` features, it has one attribute:
+
+ .. attribute:: error_status_hint
+
+ This gives a suggestion as to what status code a server might use if
+ this error occurred as part of a request.
+
+ For a :exc:`RemoteProtocolError`, this is useful as a suggestion for
+ how you might want to respond to a misbehaving peer, if you're
+ implementing a server.
+
+ For a :exc:`LocalProtocolError`, this can be taken as a suggestion for
+ how your peer might have responded to *you* if h11 had allowed you to
+ continue.
+
+ The default is 400 Bad Request, a generic catch-all for protocol
+ violations.
+
+ """
+
+ def __init__(self, msg, error_status_hint=400):
+ if type(self) is ProtocolError:
+ raise TypeError("tried to directly instantiate ProtocolError")
+ Exception.__init__(self, msg)
+ self.error_status_hint = error_status_hint
+
+
+# Strategy: there are a number of public APIs where a LocalProtocolError can
+# be raised (send(), all the different event constructors, ...), and only one
+# public API where RemoteProtocolError can be raised
+# (receive_data()). Therefore we always raise LocalProtocolError internally,
+# and then receive_data will translate this into a RemoteProtocolError.
+#
+# Internally:
+# LocalProtocolError is the generic "ProtocolError".
+# Externally:
+# LocalProtocolError is for local errors and RemoteProtocolError is for
+# remote errors.
+class LocalProtocolError(ProtocolError):
+ def _reraise_as_remote_protocol_error(self):
+ # After catching a LocalProtocolError, use this method to re-raise it
+ # as a RemoteProtocolError. This method must be called from inside an
+ # except: block.
+ #
+ # An easy way to get an equivalent RemoteProtocolError is just to
+ # modify 'self' in place.
+ self.__class__ = RemoteProtocolError
+ # But the re-raising is somewhat non-trivial -- you might think that
+ # now that we've modified the in-flight exception object, that just
+ # doing 'raise' to re-raise it would be enough. But it turns out that
+ # this doesn't work, because Python tracks the exception type
+ # (exc_info[0]) separately from the exception object (exc_info[1]),
+ # and we only modified the latter. So we really do need to re-raise
+ # the new type explicitly.
+ # On py3, the traceback is part of the exception object, so our
+ # in-place modification preserved it and we can just re-raise:
+ raise self
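The in-place class swap that `_reraise_as_remote_protocol_error` performs can be demonstrated standalone. This sketch uses hypothetical `LocalError`/`RemoteError` names rather than h11's real exception classes:

```python
class LocalError(Exception):
    pass


class RemoteError(Exception):
    pass


def reraise_as_remote(exc):
    # Mutating __class__ converts the in-flight exception object in place;
    # explicitly re-raising the object (rather than a bare `raise`)
    # guarantees the new type is what propagates.
    exc.__class__ = RemoteError
    raise exc


try:
    try:
        raise LocalError("bad framing")
    except LocalError as e:
        reraise_as_remote(e)
except RemoteError as e:
    caught = e

assert type(caught) is RemoteError
assert str(caught) == "bad framing"  # message survives the class swap
```

Because the traceback lives on the exception object in Python 3, the swap also preserves the original traceback, which is the point of mutating rather than constructing a fresh exception.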
+
+
+class RemoteProtocolError(ProtocolError):
+ pass
+
+
+def validate(regex, data, msg="malformed data", *format_args):
+ match = regex.fullmatch(data)
+ if not match:
+ if format_args:
+ msg = msg.format(*format_args)
+ raise LocalProtocolError(msg)
+ return match.groupdict()
+
+
+# Sentinel values
+#
+# - Inherit identity-based comparison and hashing from object
+# - Have a nice repr
+# - Have a *bonus property*: type(sentinel) is sentinel
+#
+# The bonus property is useful if you want to take the return value from
+# next_event() and do some sort of dispatch based on type(event).
+class _SentinelBase(type):
+ def __repr__(self):
+ return self.__name__
+
+
+def make_sentinel(name):
+ cls = _SentinelBase(name, (_SentinelBase,), {})
+ cls.__class__ = cls
+ return cls
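The sentinel machinery is small enough to exercise directly. This sketch duplicates the pattern from the diff (standalone, same two helpers) to show the repr and the `type(sentinel) is sentinel` bonus property:

```python
class _SentinelBase(type):
    def __repr__(self):
        return self.__name__


def make_sentinel(name):
    # Create a class whose metaclass is _SentinelBase, then make the
    # class its own type, so that type(sentinel) is sentinel.
    cls = _SentinelBase(name, (_SentinelBase,), {})
    cls.__class__ = cls
    return cls


NEED_DATA = make_sentinel("NEED_DATA")
PAUSED = make_sentinel("PAUSED")

assert repr(NEED_DATA) == "NEED_DATA"
assert type(NEED_DATA) is NEED_DATA  # the "bonus property"
assert NEED_DATA is not PAUSED       # identity-based comparison
```

The bonus property is what lets callers write one uniform `type(event)` dispatch that covers both real event objects and sentinel return values like `NEED_DATA`.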
+
+
+# Used for methods, request targets, HTTP versions, header names, and header
+# values. Accepts ascii-strings, or bytes/bytearray/memoryview/..., and always
+# returns bytes.
+def bytesify(s):
+ # Fast-path:
+ if type(s) is bytes:
+ return s
+ if isinstance(s, str):
+ s = s.encode("ascii")
+ if isinstance(s, int):
+ raise TypeError("expected bytes-like object, not int")
+ return bytes(s)
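A quick note on the `int` guard in `bytesify`: without it, passing an integer would fall through to `bytes(s)`, and `bytes(10)` silently produces ten NUL bytes rather than an error. A standalone mirror of the helper shows the accepted inputs:

```python
def bytesify(s):
    # Mirror of the helper above: ascii str or bytes-like in, bytes out.
    if type(s) is bytes:
        return s
    if isinstance(s, str):
        s = s.encode("ascii")
    if isinstance(s, int):
        # Without this guard, bytes(10) would silently yield b"\x00" * 10.
        raise TypeError("expected bytes-like object, not int")
    return bytes(s)


assert bytesify("Host") == b"Host"
assert bytesify(b"Host") == b"Host"
assert bytesify(bytearray(b"abc")) == b"abc"
assert bytesify(memoryview(b"xyz")) == b"xyz"
```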
diff --git a/.venv/lib/python3.9/site-packages/h11/_version.py b/.venv/lib/python3.9/site-packages/h11/_version.py
new file mode 100644
index 0000000..cb5c2c3
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/_version.py
@@ -0,0 +1,16 @@
+# This file must be kept very simple, because it is consumed from several
+# places -- it is imported by h11/__init__.py, execfile'd by setup.py, etc.
+
+# We use a simple scheme:
+# 1.0.0 -> 1.0.0+dev -> 1.1.0 -> 1.1.0+dev
+# where the +dev versions are never released into the wild, they're just what
+# we stick into the VCS in between releases.
+#
+# This is compatible with PEP 440:
+# http://legacy.python.org/dev/peps/pep-0440/
+# via the use of the "local suffix" "+dev", which is disallowed on index
+# servers and causes 1.0.0+dev to sort after plain 1.0.0, which is what we
+# want. (Contrast with the special suffix 1.0.0.dev, which sorts *before*
+# 1.0.0.)
+
+__version__ = "0.12.0"
diff --git a/.venv/lib/python3.9/site-packages/h11/_writers.py b/.venv/lib/python3.9/site-packages/h11/_writers.py
new file mode 100644
index 0000000..cb5e8a8
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/_writers.py
@@ -0,0 +1,123 @@
+# Code to write HTTP data
+#
+# Strategy: each writer takes an event + a write-some-bytes function, which it
+# calls.
+#
+# WRITERS is a dict describing how to pick a writer. It maps states to either:
+# - a writer
+# - or, for body writers, a dict of framing-dependent writer factories
+
+from ._events import Data, EndOfMessage
+from ._state import CLIENT, IDLE, SEND_BODY, SEND_RESPONSE, SERVER
+from ._util import LocalProtocolError
+
+__all__ = ["WRITERS"]
+
+
+def write_headers(headers, write):
+ # "Since the Host field-value is critical information for handling a
+ # request, a user agent SHOULD generate Host as the first header field
+ # following the request-line." - RFC 7230
+ raw_items = headers._full_items
+ for raw_name, name, value in raw_items:
+ if name == b"host":
+ write(b"%s: %s\r\n" % (raw_name, value))
+ for raw_name, name, value in raw_items:
+ if name != b"host":
+ write(b"%s: %s\r\n" % (raw_name, value))
+ write(b"\r\n")
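The Host-first ordering that `write_headers` implements can be seen in a standalone sketch. The real function consumes a `Headers` object's private `_full_items`; this version substitutes a plain list of `(raw_name, normalized_name, value)` triples, which is an assumption made for illustration:

```python
def write_headers(full_items, write):
    # Emit Host first (RFC 7230: a user agent SHOULD generate Host as the
    # first header field), then everything else in original order.
    for raw_name, name, value in full_items:
        if name == b"host":
            write(b"%s: %s\r\n" % (raw_name, value))
    for raw_name, name, value in full_items:
        if name != b"host":
            write(b"%s: %s\r\n" % (raw_name, value))
    write(b"\r\n")


out = []
write_headers(
    [(b"Accept", b"accept", b"*/*"), (b"Host", b"host", b"example.com")],
    out.append,
)
assert b"".join(out) == b"Host: example.com\r\nAccept: */*\r\n\r\n"
```

Keeping the raw name alongside the normalized name lets the writer preserve the caller's original capitalization while still matching case-insensitively.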
+
+
+def write_request(request, write):
+ if request.http_version != b"1.1":
+ raise LocalProtocolError("I only send HTTP/1.1")
+ write(b"%s %s HTTP/1.1\r\n" % (request.method, request.target))
+ write_headers(request.headers, write)
+
+
+# Shared between InformationalResponse and Response
+def write_any_response(response, write):
+ if response.http_version != b"1.1":
+ raise LocalProtocolError("I only send HTTP/1.1")
+ status_bytes = str(response.status_code).encode("ascii")
+ # We don't bother sending ascii status messages like "OK"; they're
+ # optional and ignored by the protocol. (But the space after the numeric
+ # status code is mandatory.)
+ #
+ # XX FIXME: could at least make an effort to pull out the status message
+ # from stdlib's http.HTTPStatus table. Or maybe just steal their enums
+ # (either by import or copy/paste). We already accept them as status codes
+ # since they're of type IntEnum < int.
+ write(b"HTTP/1.1 %s %s\r\n" % (status_bytes, response.reason))
+ write_headers(response.headers, write)
+
+
+class BodyWriter:
+ def __call__(self, event, write):
+ if type(event) is Data:
+ self.send_data(event.data, write)
+ elif type(event) is EndOfMessage:
+ self.send_eom(event.headers, write)
+ else: # pragma: no cover
+ assert False
+
+
+#
+# These are all careful not to do anything to 'data' except call len(data) and
+# write(data). This allows us to transparently pass-through funny objects,
+# like placeholder objects referring to files on disk that will be sent via
+# sendfile(2).
+#
+class ContentLengthWriter(BodyWriter):
+ def __init__(self, length):
+ self._length = length
+
+ def send_data(self, data, write):
+ self._length -= len(data)
+ if self._length < 0:
+ raise LocalProtocolError("Too much data for declared Content-Length")
+ write(data)
+
+ def send_eom(self, headers, write):
+ if self._length != 0:
+ raise LocalProtocolError("Too little data for declared Content-Length")
+ if headers:
+ raise LocalProtocolError("Content-Length and trailers don't mix")
+
+
+class ChunkedWriter(BodyWriter):
+ def send_data(self, data, write):
+ # if we encoded 0-length data in the naive way, it would look like an
+ # end-of-message.
+ if not data:
+ return
+ write(b"%x\r\n" % len(data))
+ write(data)
+ write(b"\r\n")
+
+ def send_eom(self, headers, write):
+ write(b"0\r\n")
+ write_headers(headers, write)
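The wire format `ChunkedWriter` emits is hex length, CRLF, payload, CRLF per chunk, with end-of-message signalled by a zero-length chunk plus a (possibly empty) trailer block. A small standalone encoder makes the framing concrete, including the empty-data special case noted in the comment above:

```python
def encode_chunk(data):
    # hex size, CRLF, payload, CRLF -- one chunk on the wire
    if not data:
        # Encoding b"" naively would produce b"0\r\n\r\n", which is
        # indistinguishable from end-of-message, so emit nothing.
        return b""
    return b"%x\r\n%s\r\n" % (len(data), data)


assert encode_chunk(b"1234567890") == b"a\r\n1234567890\r\n"
assert encode_chunk(b"") == b""

# A complete body: one chunk, a suppressed empty chunk, then the
# terminating zero-length chunk with no trailers.
body = encode_chunk(b"hello") + encode_chunk(b"") + b"0\r\n\r\n"
assert body == b"5\r\nhello\r\n0\r\n\r\n"
```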
+
+
+class Http10Writer(BodyWriter):
+ def send_data(self, data, write):
+ write(data)
+
+ def send_eom(self, headers, write):
+ if headers:
+ raise LocalProtocolError("can't send trailers to HTTP/1.0 client")
+ # no need to close the socket ourselves, that will be taken care of by
+ # Connection: close machinery
+
+
+WRITERS = {
+ (CLIENT, IDLE): write_request,
+ (SERVER, IDLE): write_any_response,
+ (SERVER, SEND_RESPONSE): write_any_response,
+ SEND_BODY: {
+ "chunked": ChunkedWriter,
+ "content-length": ContentLengthWriter,
+ "http/1.0": Http10Writer,
+ },
+}
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/__init__.py b/.venv/lib/python3.9/site-packages/h11/tests/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..cf90bfb
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/helpers.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/helpers.cpython-39.pyc
new file mode 100644
index 0000000..325170c
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/helpers.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_against_stdlib_http.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_against_stdlib_http.cpython-39.pyc
new file mode 100644
index 0000000..af06010
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_against_stdlib_http.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_connection.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_connection.cpython-39.pyc
new file mode 100644
index 0000000..a84ab46
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_connection.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_events.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_events.cpython-39.pyc
new file mode 100644
index 0000000..688a74f
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_events.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_headers.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_headers.cpython-39.pyc
new file mode 100644
index 0000000..3c990d3
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_headers.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_helpers.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_helpers.cpython-39.pyc
new file mode 100644
index 0000000..ed2edd3
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_helpers.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_io.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_io.cpython-39.pyc
new file mode 100644
index 0000000..d66bc1e
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_io.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_receivebuffer.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_receivebuffer.cpython-39.pyc
new file mode 100644
index 0000000..f70aa75
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_receivebuffer.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_state.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_state.cpython-39.pyc
new file mode 100644
index 0000000..3836256
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_state.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_util.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_util.cpython-39.pyc
new file mode 100644
index 0000000..4e2407a
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h11/tests/__pycache__/test_util.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/data/test-file b/.venv/lib/python3.9/site-packages/h11/tests/data/test-file
new file mode 100644
index 0000000..d0be0a6
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/tests/data/test-file
@@ -0,0 +1 @@
+92b12bc045050b55b848d37167a1a63947c364579889ce1d39788e45e9fac9e5
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/helpers.py b/.venv/lib/python3.9/site-packages/h11/tests/helpers.py
new file mode 100644
index 0000000..9d2cf38
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/tests/helpers.py
@@ -0,0 +1,77 @@
+from .._connection import *
+from .._events import *
+from .._state import *
+
+
+def get_all_events(conn):
+ got_events = []
+ while True:
+ event = conn.next_event()
+ if event in (NEED_DATA, PAUSED):
+ break
+ got_events.append(event)
+ if type(event) is ConnectionClosed:
+ break
+ return got_events
+
+
+def receive_and_get(conn, data):
+ conn.receive_data(data)
+ return get_all_events(conn)
+
+
+# Merges adjacent Data events, converts payloads to bytestrings, and removes
+# chunk boundaries.
+def normalize_data_events(in_events):
+ out_events = []
+ for event in in_events:
+ if type(event) is Data:
+ event.data = bytes(event.data)
+ event.chunk_start = False
+ event.chunk_end = False
+ if out_events and type(out_events[-1]) is type(event) is Data:
+ out_events[-1].data += event.data
+ else:
+ out_events.append(event)
+ return out_events
+
+
+# Given that we want to write tests that push some events through a Connection
+# and check that its state updates appropriately... we might as well make a habit
+# of pushing them through two Connections with a fake network link in
+# between.
+class ConnectionPair:
+ def __init__(self):
+ self.conn = {CLIENT: Connection(CLIENT), SERVER: Connection(SERVER)}
+ self.other = {CLIENT: SERVER, SERVER: CLIENT}
+
+ @property
+ def conns(self):
+ return self.conn.values()
+
+ # expect="match" if expect=send_events; expect=[...] to say what expected
+ def send(self, role, send_events, expect="match"):
+ if not isinstance(send_events, list):
+ send_events = [send_events]
+ data = b""
+ closed = False
+ for send_event in send_events:
+ new_data = self.conn[role].send(send_event)
+ if new_data is None:
+ closed = True
+ else:
+ data += new_data
+ # send uses b"" to mean b"", and None to mean closed
+ # receive uses b"" to mean closed, and None to mean "try again"
+ # so we have to translate between the two conventions
+ if data:
+ self.conn[self.other[role]].receive_data(data)
+ if closed:
+ self.conn[self.other[role]].receive_data(b"")
+ got_events = get_all_events(self.conn[self.other[role]])
+ if expect == "match":
+ expect = send_events
+ if not isinstance(expect, list):
+ expect = [expect]
+ assert got_events == expect
+ return data
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/test_against_stdlib_http.py b/.venv/lib/python3.9/site-packages/h11/tests/test_against_stdlib_http.py
new file mode 100644
index 0000000..e6c5db4
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/tests/test_against_stdlib_http.py
@@ -0,0 +1,111 @@
+import json
+import os.path
+import socket
+import socketserver
+import threading
+from contextlib import closing, contextmanager
+from http.server import SimpleHTTPRequestHandler
+from urllib.request import urlopen
+
+import h11
+
+
+@contextmanager
+def socket_server(handler):
+ httpd = socketserver.TCPServer(("127.0.0.1", 0), handler)
+ thread = threading.Thread(
+ target=httpd.serve_forever, kwargs={"poll_interval": 0.01}
+ )
+ thread.daemon = True
+ try:
+ thread.start()
+ yield httpd
+ finally:
+ httpd.shutdown()
+
+
+test_file_path = os.path.join(os.path.dirname(__file__), "data/test-file")
+with open(test_file_path, "rb") as f:
+ test_file_data = f.read()
+
+
+class SingleMindedRequestHandler(SimpleHTTPRequestHandler):
+ def translate_path(self, path):
+ return test_file_path
+
+
+def test_h11_as_client():
+ with socket_server(SingleMindedRequestHandler) as httpd:
+ with closing(socket.create_connection(httpd.server_address)) as s:
+ c = h11.Connection(h11.CLIENT)
+
+ s.sendall(
+ c.send(
+ h11.Request(
+ method="GET", target="/foo", headers=[("Host", "localhost")]
+ )
+ )
+ )
+ s.sendall(c.send(h11.EndOfMessage()))
+
+ data = bytearray()
+ while True:
+ event = c.next_event()
+ print(event)
+ if event is h11.NEED_DATA:
+ # Use a small read buffer to make things more challenging
+ # and exercise more paths :-)
+ c.receive_data(s.recv(10))
+ continue
+ if type(event) is h11.Response:
+ assert event.status_code == 200
+ if type(event) is h11.Data:
+ data += event.data
+ if type(event) is h11.EndOfMessage:
+ break
+ assert bytes(data) == test_file_data
+
+
+class H11RequestHandler(socketserver.BaseRequestHandler):
+ def handle(self):
+ with closing(self.request) as s:
+ c = h11.Connection(h11.SERVER)
+ request = None
+ while True:
+ event = c.next_event()
+ if event is h11.NEED_DATA:
+ # Use a small read buffer to make things more challenging
+ # and exercise more paths :-)
+ c.receive_data(s.recv(10))
+ continue
+ if type(event) is h11.Request:
+ request = event
+ if type(event) is h11.EndOfMessage:
+ break
+ info = json.dumps(
+ {
+ "method": request.method.decode("ascii"),
+ "target": request.target.decode("ascii"),
+ "headers": {
+ name.decode("ascii"): value.decode("ascii")
+ for (name, value) in request.headers
+ },
+ }
+ )
+ s.sendall(c.send(h11.Response(status_code=200, headers=[])))
+ s.sendall(c.send(h11.Data(data=info.encode("ascii"))))
+ s.sendall(c.send(h11.EndOfMessage()))
+
+
+def test_h11_as_server():
+ with socket_server(H11RequestHandler) as httpd:
+ host, port = httpd.server_address
+ url = "http://{}:{}/some-path".format(host, port)
+ with closing(urlopen(url)) as f:
+ assert f.getcode() == 200
+ data = f.read()
+ info = json.loads(data.decode("ascii"))
+ print(info)
+ assert info["method"] == "GET"
+ assert info["target"] == "/some-path"
+ assert "urllib" in info["headers"]["user-agent"]
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/test_connection.py b/.venv/lib/python3.9/site-packages/h11/tests/test_connection.py
new file mode 100644
index 0000000..baadec8
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/tests/test_connection.py
@@ -0,0 +1,1078 @@
+import pytest
+
+from .._connection import _body_framing, _keep_alive, Connection, NEED_DATA, PAUSED
+from .._events import *
+from .._state import *
+from .._util import LocalProtocolError, RemoteProtocolError
+from .helpers import ConnectionPair, get_all_events, receive_and_get
+
+
+def test__keep_alive():
+ assert _keep_alive(
+ Request(method="GET", target="/", headers=[("Host", "Example.com")])
+ )
+ assert not _keep_alive(
+ Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "Example.com"), ("Connection", "close")],
+ )
+ )
+ assert not _keep_alive(
+ Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "Example.com"), ("Connection", "a, b, cLOse, foo")],
+ )
+ )
+ assert not _keep_alive(
+ Request(method="GET", target="/", headers=[], http_version="1.0")
+ )
+
+ assert _keep_alive(Response(status_code=200, headers=[]))
+ assert not _keep_alive(Response(status_code=200, headers=[("Connection", "close")]))
+ assert not _keep_alive(
+ Response(status_code=200, headers=[("Connection", "a, b, cLOse, foo")])
+ )
+ assert not _keep_alive(Response(status_code=200, headers=[], http_version="1.0"))
+
+
+def test__body_framing():
+ def headers(cl, te):
+ headers = []
+ if cl is not None:
+ headers.append(("Content-Length", str(cl)))
+ if te:
+ headers.append(("Transfer-Encoding", "chunked"))
+ return headers
+
+ def resp(status_code=200, cl=None, te=False):
+ return Response(status_code=status_code, headers=headers(cl, te))
+
+ def req(cl=None, te=False):
+ h = headers(cl, te)
+ h += [("Host", "example.com")]
+ return Request(method="GET", target="/", headers=h)
+
+ # Special cases where the headers are ignored:
+ for kwargs in [{}, {"cl": 100}, {"te": True}, {"cl": 100, "te": True}]:
+ for meth, r in [
+ (b"HEAD", resp(**kwargs)),
+ (b"GET", resp(status_code=204, **kwargs)),
+ (b"GET", resp(status_code=304, **kwargs)),
+ ]:
+ assert _body_framing(meth, r) == ("content-length", (0,))
+
+ # Transfer-encoding
+ for kwargs in [{"te": True}, {"cl": 100, "te": True}]:
+ for meth, r in [(None, req(**kwargs)), (b"GET", resp(**kwargs))]:
+ assert _body_framing(meth, r) == ("chunked", ())
+
+ # Content-Length
+ for meth, r in [(None, req(cl=100)), (b"GET", resp(cl=100))]:
+ assert _body_framing(meth, r) == ("content-length", (100,))
+
+ # No headers
+ assert _body_framing(None, req()) == ("content-length", (0,))
+ assert _body_framing(b"GET", resp()) == ("http/1.0", ())
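The rules this test exercises can be summarized in a simplified reconstruction of the framing decision (the real `_body_framing` lives in `h11._connection` and takes event objects; the flat parameters here are an assumption for illustration):

```python
def body_framing(method, is_response, status_code, content_length, chunked):
    # Responses that can never carry a body ignore framing headers entirely.
    if is_response and (method == b"HEAD" or status_code in (204, 304)):
        return ("content-length", (0,))
    # Transfer-Encoding: chunked wins over Content-Length (RFC 7230 3.3.3).
    if chunked:
        return ("chunked", ())
    if content_length is not None:
        return ("content-length", (content_length,))
    # No framing headers: requests have no body; responses read until EOF.
    if not is_response:
        return ("content-length", (0,))
    return ("http/1.0", ())


assert body_framing(b"HEAD", True, 200, 100, True) == ("content-length", (0,))
assert body_framing(b"GET", True, 200, None, True) == ("chunked", ())
assert body_framing(None, False, None, 100, False) == ("content-length", (100,))
assert body_framing(b"GET", True, 200, None, False) == ("http/1.0", ())
```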
+
+
+def test_Connection_basics_and_content_length():
+ with pytest.raises(ValueError):
+ Connection("CLIENT")
+
+ p = ConnectionPair()
+ assert p.conn[CLIENT].our_role is CLIENT
+ assert p.conn[CLIENT].their_role is SERVER
+ assert p.conn[SERVER].our_role is SERVER
+ assert p.conn[SERVER].their_role is CLIENT
+
+ data = p.send(
+ CLIENT,
+ Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "example.com"), ("Content-Length", "10")],
+ ),
+ )
+ assert data == (
+ b"GET / HTTP/1.1\r\n" b"Host: example.com\r\n" b"Content-Length: 10\r\n\r\n"
+ )
+
+ for conn in p.conns:
+ assert conn.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
+ assert p.conn[CLIENT].our_state is SEND_BODY
+ assert p.conn[CLIENT].their_state is SEND_RESPONSE
+ assert p.conn[SERVER].our_state is SEND_RESPONSE
+ assert p.conn[SERVER].their_state is SEND_BODY
+
+ assert p.conn[CLIENT].their_http_version is None
+ assert p.conn[SERVER].their_http_version == b"1.1"
+
+ data = p.send(SERVER, InformationalResponse(status_code=100, headers=[]))
+ assert data == b"HTTP/1.1 100 \r\n\r\n"
+
+ data = p.send(SERVER, Response(status_code=200, headers=[("Content-Length", "11")]))
+ assert data == b"HTTP/1.1 200 \r\nContent-Length: 11\r\n\r\n"
+
+ for conn in p.conns:
+ assert conn.states == {CLIENT: SEND_BODY, SERVER: SEND_BODY}
+
+ assert p.conn[CLIENT].their_http_version == b"1.1"
+ assert p.conn[SERVER].their_http_version == b"1.1"
+
+ data = p.send(CLIENT, Data(data=b"12345"))
+ assert data == b"12345"
+ data = p.send(
+ CLIENT, Data(data=b"67890"), expect=[Data(data=b"67890"), EndOfMessage()]
+ )
+ assert data == b"67890"
+ data = p.send(CLIENT, EndOfMessage(), expect=[])
+ assert data == b""
+
+ for conn in p.conns:
+ assert conn.states == {CLIENT: DONE, SERVER: SEND_BODY}
+
+ data = p.send(SERVER, Data(data=b"1234567890"))
+ assert data == b"1234567890"
+ data = p.send(SERVER, Data(data=b"1"), expect=[Data(data=b"1"), EndOfMessage()])
+ assert data == b"1"
+ data = p.send(SERVER, EndOfMessage(), expect=[])
+ assert data == b""
+
+ for conn in p.conns:
+ assert conn.states == {CLIENT: DONE, SERVER: DONE}
+
+
+def test_chunked():
+ p = ConnectionPair()
+
+ p.send(
+ CLIENT,
+ Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "example.com"), ("Transfer-Encoding", "chunked")],
+ ),
+ )
+ data = p.send(CLIENT, Data(data=b"1234567890", chunk_start=True, chunk_end=True))
+ assert data == b"a\r\n1234567890\r\n"
+ data = p.send(CLIENT, Data(data=b"abcde", chunk_start=True, chunk_end=True))
+ assert data == b"5\r\nabcde\r\n"
+ data = p.send(CLIENT, Data(data=b""), expect=[])
+ assert data == b""
+ data = p.send(CLIENT, EndOfMessage(headers=[("hello", "there")]))
+ assert data == b"0\r\nhello: there\r\n\r\n"
+
+ p.send(
+ SERVER, Response(status_code=200, headers=[("Transfer-Encoding", "chunked")])
+ )
+ p.send(SERVER, Data(data=b"54321", chunk_start=True, chunk_end=True))
+ p.send(SERVER, Data(data=b"12345", chunk_start=True, chunk_end=True))
+ p.send(SERVER, EndOfMessage())
+
+ for conn in p.conns:
+ assert conn.states == {CLIENT: DONE, SERVER: DONE}
+
+
+def test_chunk_boundaries():
+ conn = Connection(our_role=SERVER)
+
+ request = (
+ b"POST / HTTP/1.1\r\n"
+ b"Host: example.com\r\n"
+ b"Transfer-Encoding: chunked\r\n"
+ b"\r\n"
+ )
+ conn.receive_data(request)
+ assert conn.next_event() == Request(
+ method="POST",
+ target="/",
+ headers=[("Host", "example.com"), ("Transfer-Encoding", "chunked")],
+ )
+ assert conn.next_event() is NEED_DATA
+
+ conn.receive_data(b"5\r\nhello\r\n")
+ assert conn.next_event() == Data(data=b"hello", chunk_start=True, chunk_end=True)
+
+ conn.receive_data(b"5\r\nhel")
+ assert conn.next_event() == Data(data=b"hel", chunk_start=True, chunk_end=False)
+
+ conn.receive_data(b"l")
+ assert conn.next_event() == Data(data=b"l", chunk_start=False, chunk_end=False)
+
+ conn.receive_data(b"o\r\n")
+ assert conn.next_event() == Data(data=b"o", chunk_start=False, chunk_end=True)
+
+ conn.receive_data(b"5\r\nhello")
+ assert conn.next_event() == Data(data=b"hello", chunk_start=True, chunk_end=True)
+
+ conn.receive_data(b"\r\n")
+ assert conn.next_event() == NEED_DATA
+
+ conn.receive_data(b"0\r\n\r\n")
+ assert conn.next_event() == EndOfMessage()
+
+
+def test_client_talking_to_http10_server():
+ c = Connection(CLIENT)
+ c.send(Request(method="GET", target="/", headers=[("Host", "example.com")]))
+ c.send(EndOfMessage())
+ assert c.our_state is DONE
+ # No content-length, so Http10 framing for body
+ assert receive_and_get(c, b"HTTP/1.0 200 OK\r\n\r\n") == [
+ Response(status_code=200, headers=[], http_version="1.0", reason=b"OK")
+ ]
+ assert c.our_state is MUST_CLOSE
+ assert receive_and_get(c, b"12345") == [Data(data=b"12345")]
+ assert receive_and_get(c, b"67890") == [Data(data=b"67890")]
+ assert receive_and_get(c, b"") == [EndOfMessage(), ConnectionClosed()]
+ assert c.their_state is CLOSED
+
+
+def test_server_talking_to_http10_client():
+ c = Connection(SERVER)
+ # No content-length, so no body
+ # NB: no host header
+ assert receive_and_get(c, b"GET / HTTP/1.0\r\n\r\n") == [
+ Request(method="GET", target="/", headers=[], http_version="1.0"),
+ EndOfMessage(),
+ ]
+ assert c.their_state is MUST_CLOSE
+
+ # We automatically Connection: close back at them
+ assert (
+ c.send(Response(status_code=200, headers=[]))
+ == b"HTTP/1.1 200 \r\nConnection: close\r\n\r\n"
+ )
+
+ assert c.send(Data(data=b"12345")) == b"12345"
+ assert c.send(EndOfMessage()) == b""
+ assert c.our_state is MUST_CLOSE
+
+ # Check that it works if they do send Content-Length
+ c = Connection(SERVER)
+ # NB: no host header
+ assert receive_and_get(c, b"POST / HTTP/1.0\r\nContent-Length: 10\r\n\r\n1") == [
+ Request(
+ method="POST",
+ target="/",
+ headers=[("Content-Length", "10")],
+ http_version="1.0",
+ ),
+ Data(data=b"1"),
+ ]
+ assert receive_and_get(c, b"234567890") == [Data(data=b"234567890"), EndOfMessage()]
+ assert c.their_state is MUST_CLOSE
+ assert receive_and_get(c, b"") == [ConnectionClosed()]
+
+
+def test_automatic_transfer_encoding_in_response():
+ # Check that in responses, the user can specify either Transfer-Encoding:
+ # chunked or no framing at all, and in both cases we automatically select
+ # the right option depending on whether the peer speaks HTTP/1.0 or
+ # HTTP/1.1
+ for user_headers in [
+ [("Transfer-Encoding", "chunked")],
+ [],
+ # In fact, this even works if Content-Length is set,
+ # because if both are set then Transfer-Encoding wins
+ [("Transfer-Encoding", "chunked"), ("Content-Length", "100")],
+ ]:
+ p = ConnectionPair()
+ p.send(
+ CLIENT,
+ [
+ Request(method="GET", target="/", headers=[("Host", "example.com")]),
+ EndOfMessage(),
+ ],
+ )
+ # When speaking to HTTP/1.1 client, all of the above cases get
+ # normalized to Transfer-Encoding: chunked
+ p.send(
+ SERVER,
+ Response(status_code=200, headers=user_headers),
+ expect=Response(
+ status_code=200, headers=[("Transfer-Encoding", "chunked")]
+ ),
+ )
+
+ # When speaking to HTTP/1.0 client, all of the above cases get
+ # normalized to no-framing-headers
+ c = Connection(SERVER)
+ receive_and_get(c, b"GET / HTTP/1.0\r\n\r\n")
+ assert (
+ c.send(Response(status_code=200, headers=user_headers))
+ == b"HTTP/1.1 200 \r\nConnection: close\r\n\r\n"
+ )
+ assert c.send(Data(data=b"12345")) == b"12345"
+
+
+def test_automagic_connection_close_handling():
+ p = ConnectionPair()
+ # If the user explicitly sets Connection: close, then we notice and
+ # respect it
+ p.send(
+ CLIENT,
+ [
+ Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "example.com"), ("Connection", "close")],
+ ),
+ EndOfMessage(),
+ ],
+ )
+ for conn in p.conns:
+ assert conn.states[CLIENT] is MUST_CLOSE
+ # And if the client sets it, the server automatically echoes it back
+ p.send(
+ SERVER,
+ # no header here...
+ [Response(status_code=204, headers=[]), EndOfMessage()],
+ # ...but oh look, it arrived anyway
+ expect=[
+ Response(status_code=204, headers=[("connection", "close")]),
+ EndOfMessage(),
+ ],
+ )
+ for conn in p.conns:
+ assert conn.states == {CLIENT: MUST_CLOSE, SERVER: MUST_CLOSE}
+
+
+def test_100_continue():
+ def setup():
+ p = ConnectionPair()
+ p.send(
+ CLIENT,
+ Request(
+ method="GET",
+ target="/",
+ headers=[
+ ("Host", "example.com"),
+ ("Content-Length", "100"),
+ ("Expect", "100-continue"),
+ ],
+ ),
+ )
+ for conn in p.conns:
+ assert conn.client_is_waiting_for_100_continue
+ assert not p.conn[CLIENT].they_are_waiting_for_100_continue
+ assert p.conn[SERVER].they_are_waiting_for_100_continue
+ return p
+
+ # Disabled by 100 Continue
+ p = setup()
+ p.send(SERVER, InformationalResponse(status_code=100, headers=[]))
+ for conn in p.conns:
+ assert not conn.client_is_waiting_for_100_continue
+ assert not conn.they_are_waiting_for_100_continue
+
+ # Disabled by a real response
+ p = setup()
+ p.send(
+ SERVER, Response(status_code=200, headers=[("Transfer-Encoding", "chunked")])
+ )
+ for conn in p.conns:
+ assert not conn.client_is_waiting_for_100_continue
+ assert not conn.they_are_waiting_for_100_continue
+
+ # Disabled by the client going ahead and sending stuff anyway
+ p = setup()
+ p.send(CLIENT, Data(data=b"12345"))
+ for conn in p.conns:
+ assert not conn.client_is_waiting_for_100_continue
+ assert not conn.they_are_waiting_for_100_continue
+
+
+def test_max_incomplete_event_size_countermeasure():
+ # Infinitely long headers are definitely not okay
+ c = Connection(SERVER)
+ c.receive_data(b"GET / HTTP/1.0\r\nEndless: ")
+ assert c.next_event() is NEED_DATA
+ with pytest.raises(RemoteProtocolError):
+ while True:
+ c.receive_data(b"a" * 1024)
+ c.next_event()
+
+ # Checking that the same header is accepted / rejected depending on the
+ # max_incomplete_event_size setting:
+ c = Connection(SERVER, max_incomplete_event_size=5000)
+ c.receive_data(b"GET / HTTP/1.0\r\nBig: ")
+ c.receive_data(b"a" * 4000)
+ c.receive_data(b"\r\n\r\n")
+ assert get_all_events(c) == [
+ Request(
+ method="GET", target="/", http_version="1.0", headers=[("big", "a" * 4000)]
+ ),
+ EndOfMessage(),
+ ]
+
+ c = Connection(SERVER, max_incomplete_event_size=4000)
+ c.receive_data(b"GET / HTTP/1.0\r\nBig: ")
+ c.receive_data(b"a" * 4000)
+ with pytest.raises(RemoteProtocolError):
+ c.next_event()
+
+    # Temporarily exceeding the size limit is fine, as long as it's done with
+ # complete events:
+ c = Connection(SERVER, max_incomplete_event_size=5000)
+ c.receive_data(b"GET / HTTP/1.0\r\nContent-Length: 10000")
+ c.receive_data(b"\r\n\r\n" + b"a" * 10000)
+ assert get_all_events(c) == [
+ Request(
+ method="GET",
+ target="/",
+ http_version="1.0",
+ headers=[("Content-Length", "10000")],
+ ),
+ Data(data=b"a" * 10000),
+ EndOfMessage(),
+ ]
+
+ c = Connection(SERVER, max_incomplete_event_size=100)
+ # Two pipelined requests to create a way-too-big receive buffer... but
+ # it's fine because we're not checking
+ c.receive_data(
+ b"GET /1 HTTP/1.1\r\nHost: a\r\n\r\n"
+ b"GET /2 HTTP/1.1\r\nHost: b\r\n\r\n" + b"X" * 1000
+ )
+ assert get_all_events(c) == [
+ Request(method="GET", target="/1", headers=[("host", "a")]),
+ EndOfMessage(),
+ ]
+ # Even more data comes in, still no problem
+ c.receive_data(b"X" * 1000)
+ # We can respond and reuse to get the second pipelined request
+ c.send(Response(status_code=200, headers=[]))
+ c.send(EndOfMessage())
+ c.start_next_cycle()
+ assert get_all_events(c) == [
+ Request(method="GET", target="/2", headers=[("host", "b")]),
+ EndOfMessage(),
+ ]
+ # But once we unpause and try to read the next message, and find that it's
+ # incomplete and the buffer is *still* way too large, then *that's* a
+ # problem:
+ c.send(Response(status_code=200, headers=[]))
+ c.send(EndOfMessage())
+ c.start_next_cycle()
+ with pytest.raises(RemoteProtocolError):
+ c.next_event()
+
+
+def test_reuse_simple():
+ p = ConnectionPair()
+ p.send(
+ CLIENT,
+ [Request(method="GET", target="/", headers=[("Host", "a")]), EndOfMessage()],
+ )
+ p.send(SERVER, [Response(status_code=200, headers=[]), EndOfMessage()])
+ for conn in p.conns:
+ assert conn.states == {CLIENT: DONE, SERVER: DONE}
+ conn.start_next_cycle()
+
+ p.send(
+ CLIENT,
+ [
+ Request(method="DELETE", target="/foo", headers=[("Host", "a")]),
+ EndOfMessage(),
+ ],
+ )
+ p.send(SERVER, [Response(status_code=404, headers=[]), EndOfMessage()])
+
+
+def test_pipelining():
+ # Client doesn't support pipelining, so we have to do this by hand
+ c = Connection(SERVER)
+ assert c.next_event() is NEED_DATA
+ # 3 requests all bunched up
+ c.receive_data(
+ b"GET /1 HTTP/1.1\r\nHost: a.com\r\nContent-Length: 5\r\n\r\n"
+ b"12345"
+ b"GET /2 HTTP/1.1\r\nHost: a.com\r\nContent-Length: 5\r\n\r\n"
+ b"67890"
+ b"GET /3 HTTP/1.1\r\nHost: a.com\r\n\r\n"
+ )
+ assert get_all_events(c) == [
+ Request(
+ method="GET",
+ target="/1",
+ headers=[("Host", "a.com"), ("Content-Length", "5")],
+ ),
+ Data(data=b"12345"),
+ EndOfMessage(),
+ ]
+ assert c.their_state is DONE
+ assert c.our_state is SEND_RESPONSE
+
+ assert c.next_event() is PAUSED
+
+ c.send(Response(status_code=200, headers=[]))
+ c.send(EndOfMessage())
+ assert c.their_state is DONE
+ assert c.our_state is DONE
+
+ c.start_next_cycle()
+
+ assert get_all_events(c) == [
+ Request(
+ method="GET",
+ target="/2",
+ headers=[("Host", "a.com"), ("Content-Length", "5")],
+ ),
+ Data(data=b"67890"),
+ EndOfMessage(),
+ ]
+ assert c.next_event() is PAUSED
+ c.send(Response(status_code=200, headers=[]))
+ c.send(EndOfMessage())
+ c.start_next_cycle()
+
+ assert get_all_events(c) == [
+ Request(method="GET", target="/3", headers=[("Host", "a.com")]),
+ EndOfMessage(),
+ ]
+ # Doesn't pause this time, no trailing data
+ assert c.next_event() is NEED_DATA
+ c.send(Response(status_code=200, headers=[]))
+ c.send(EndOfMessage())
+
+ # Arrival of more data triggers pause
+ assert c.next_event() is NEED_DATA
+ c.receive_data(b"SADF")
+ assert c.next_event() is PAUSED
+ assert c.trailing_data == (b"SADF", False)
+ # If EOF arrives while paused, we don't see that either:
+ c.receive_data(b"")
+ assert c.trailing_data == (b"SADF", True)
+ assert c.next_event() is PAUSED
+ c.receive_data(b"")
+ assert c.next_event() is PAUSED
+ # Can't call receive_data with non-empty buf after closing it
+ with pytest.raises(RuntimeError):
+ c.receive_data(b"FDSA")
+
+
+def test_protocol_switch():
+ for (req, deny, accept) in [
+ (
+ Request(
+ method="CONNECT",
+ target="example.com:443",
+ headers=[("Host", "foo"), ("Content-Length", "1")],
+ ),
+ Response(status_code=404, headers=[]),
+ Response(status_code=200, headers=[]),
+ ),
+ (
+ Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "foo"), ("Content-Length", "1"), ("Upgrade", "a, b")],
+ ),
+ Response(status_code=200, headers=[]),
+ InformationalResponse(status_code=101, headers=[("Upgrade", "a")]),
+ ),
+ (
+ Request(
+ method="CONNECT",
+ target="example.com:443",
+ headers=[("Host", "foo"), ("Content-Length", "1"), ("Upgrade", "a, b")],
+ ),
+ Response(status_code=404, headers=[]),
+ # Accept CONNECT, not upgrade
+ Response(status_code=200, headers=[]),
+ ),
+ (
+ Request(
+ method="CONNECT",
+ target="example.com:443",
+ headers=[("Host", "foo"), ("Content-Length", "1"), ("Upgrade", "a, b")],
+ ),
+ Response(status_code=404, headers=[]),
+ # Accept Upgrade, not CONNECT
+ InformationalResponse(status_code=101, headers=[("Upgrade", "b")]),
+ ),
+ ]:
+
+ def setup():
+ p = ConnectionPair()
+ p.send(CLIENT, req)
+ # No switch-related state change stuff yet; the client has to
+ # finish the request before that kicks in
+ for conn in p.conns:
+ assert conn.states[CLIENT] is SEND_BODY
+ p.send(CLIENT, [Data(data=b"1"), EndOfMessage()])
+ for conn in p.conns:
+ assert conn.states[CLIENT] is MIGHT_SWITCH_PROTOCOL
+ assert p.conn[SERVER].next_event() is PAUSED
+ return p
+
+ # Test deny case
+ p = setup()
+ p.send(SERVER, deny)
+ for conn in p.conns:
+ assert conn.states == {CLIENT: DONE, SERVER: SEND_BODY}
+ p.send(SERVER, EndOfMessage())
+ # Check that re-use is still allowed after a denial
+ for conn in p.conns:
+ conn.start_next_cycle()
+
+ # Test accept case
+ p = setup()
+ p.send(SERVER, accept)
+ for conn in p.conns:
+ assert conn.states == {CLIENT: SWITCHED_PROTOCOL, SERVER: SWITCHED_PROTOCOL}
+ conn.receive_data(b"123")
+ assert conn.next_event() is PAUSED
+ conn.receive_data(b"456")
+ assert conn.next_event() is PAUSED
+ assert conn.trailing_data == (b"123456", False)
+
+ # Pausing in might-switch, then recovery
+ # (weird artificial case where the trailing data actually is valid
+ # HTTP for some reason, because this makes it easier to test the state
+ # logic)
+ p = setup()
+ sc = p.conn[SERVER]
+ sc.receive_data(b"GET / HTTP/1.0\r\n\r\n")
+ assert sc.next_event() is PAUSED
+ assert sc.trailing_data == (b"GET / HTTP/1.0\r\n\r\n", False)
+ sc.send(deny)
+ assert sc.next_event() is PAUSED
+ sc.send(EndOfMessage())
+ sc.start_next_cycle()
+ assert get_all_events(sc) == [
+ Request(method="GET", target="/", headers=[], http_version="1.0"),
+ EndOfMessage(),
+ ]
+
+ # When we're DONE, have no trailing data, and the connection gets
+ # closed, we report ConnectionClosed(). When we're in might-switch or
+ # switched, we don't.
+ p = setup()
+ sc = p.conn[SERVER]
+ sc.receive_data(b"")
+ assert sc.next_event() is PAUSED
+ assert sc.trailing_data == (b"", True)
+ p.send(SERVER, accept)
+ assert sc.next_event() is PAUSED
+
+ p = setup()
+ sc = p.conn[SERVER]
+        # (receive_data returns None; the stray "== []" comparison was a no-op)
+        sc.receive_data(b"")
+ assert sc.next_event() is PAUSED
+ sc.send(deny)
+ assert sc.next_event() == ConnectionClosed()
+
+ # You can't send after switching protocols, or while waiting for a
+ # protocol switch
+ p = setup()
+ with pytest.raises(LocalProtocolError):
+ p.conn[CLIENT].send(
+ Request(method="GET", target="/", headers=[("Host", "a")])
+ )
+ p = setup()
+ p.send(SERVER, accept)
+ with pytest.raises(LocalProtocolError):
+ p.conn[SERVER].send(Data(data=b"123"))
+
+
+def test_close_simple():
+ # Just immediately closing a new connection without anything having
+ # happened yet.
+ for (who_shot_first, who_shot_second) in [(CLIENT, SERVER), (SERVER, CLIENT)]:
+
+ def setup():
+ p = ConnectionPair()
+ p.send(who_shot_first, ConnectionClosed())
+ for conn in p.conns:
+ assert conn.states == {
+ who_shot_first: CLOSED,
+ who_shot_second: MUST_CLOSE,
+ }
+ return p
+
+ # You can keep putting b"" into a closed connection, and you keep
+ # getting ConnectionClosed() out:
+ p = setup()
+ assert p.conn[who_shot_second].next_event() == ConnectionClosed()
+ assert p.conn[who_shot_second].next_event() == ConnectionClosed()
+ p.conn[who_shot_second].receive_data(b"")
+ assert p.conn[who_shot_second].next_event() == ConnectionClosed()
+ # Second party can close...
+ p = setup()
+ p.send(who_shot_second, ConnectionClosed())
+ for conn in p.conns:
+ assert conn.our_state is CLOSED
+ assert conn.their_state is CLOSED
+ # But trying to receive new data on a closed connection is a
+ # RuntimeError (not ProtocolError, because the problem here isn't
+ # violation of HTTP, it's violation of physics)
+ p = setup()
+ with pytest.raises(RuntimeError):
+ p.conn[who_shot_second].receive_data(b"123")
+ # And receiving new data on a MUST_CLOSE connection is a ProtocolError
+ p = setup()
+ p.conn[who_shot_first].receive_data(b"GET")
+ with pytest.raises(RemoteProtocolError):
+ p.conn[who_shot_first].next_event()
+
+
+def test_close_different_states():
+ req = [
+ Request(method="GET", target="/foo", headers=[("Host", "a")]),
+ EndOfMessage(),
+ ]
+ resp = [Response(status_code=200, headers=[]), EndOfMessage()]
+
+ # Client before request
+ p = ConnectionPair()
+ p.send(CLIENT, ConnectionClosed())
+ for conn in p.conns:
+ assert conn.states == {CLIENT: CLOSED, SERVER: MUST_CLOSE}
+
+ # Client after request
+ p = ConnectionPair()
+ p.send(CLIENT, req)
+ p.send(CLIENT, ConnectionClosed())
+ for conn in p.conns:
+ assert conn.states == {CLIENT: CLOSED, SERVER: SEND_RESPONSE}
+
+ # Server after request -> not allowed
+ p = ConnectionPair()
+ p.send(CLIENT, req)
+ with pytest.raises(LocalProtocolError):
+ p.conn[SERVER].send(ConnectionClosed())
+ p.conn[CLIENT].receive_data(b"")
+ with pytest.raises(RemoteProtocolError):
+ p.conn[CLIENT].next_event()
+
+ # Server after response
+ p = ConnectionPair()
+ p.send(CLIENT, req)
+ p.send(SERVER, resp)
+ p.send(SERVER, ConnectionClosed())
+ for conn in p.conns:
+ assert conn.states == {CLIENT: MUST_CLOSE, SERVER: CLOSED}
+
+ # Both after closing (ConnectionClosed() is idempotent)
+ p = ConnectionPair()
+ p.send(CLIENT, req)
+ p.send(SERVER, resp)
+ p.send(CLIENT, ConnectionClosed())
+ p.send(SERVER, ConnectionClosed())
+ p.send(CLIENT, ConnectionClosed())
+ p.send(SERVER, ConnectionClosed())
+
+ # In the middle of sending -> not allowed
+ p = ConnectionPair()
+ p.send(
+ CLIENT,
+ Request(
+ method="GET", target="/", headers=[("Host", "a"), ("Content-Length", "10")]
+ ),
+ )
+ with pytest.raises(LocalProtocolError):
+ p.conn[CLIENT].send(ConnectionClosed())
+ p.conn[SERVER].receive_data(b"")
+ with pytest.raises(RemoteProtocolError):
+ p.conn[SERVER].next_event()
+
+
+# Receive several requests and then client shuts down their side of the
+# connection; we can respond to each
+def test_pipelined_close():
+ c = Connection(SERVER)
+ # 2 requests then a close
+ c.receive_data(
+ b"GET /1 HTTP/1.1\r\nHost: a.com\r\nContent-Length: 5\r\n\r\n"
+ b"12345"
+ b"GET /2 HTTP/1.1\r\nHost: a.com\r\nContent-Length: 5\r\n\r\n"
+ b"67890"
+ )
+ c.receive_data(b"")
+ assert get_all_events(c) == [
+ Request(
+ method="GET",
+ target="/1",
+ headers=[("host", "a.com"), ("content-length", "5")],
+ ),
+ Data(data=b"12345"),
+ EndOfMessage(),
+ ]
+ assert c.states[CLIENT] is DONE
+ c.send(Response(status_code=200, headers=[]))
+ c.send(EndOfMessage())
+ assert c.states[SERVER] is DONE
+ c.start_next_cycle()
+ assert get_all_events(c) == [
+ Request(
+ method="GET",
+ target="/2",
+ headers=[("host", "a.com"), ("content-length", "5")],
+ ),
+ Data(data=b"67890"),
+ EndOfMessage(),
+ ConnectionClosed(),
+ ]
+ assert c.states == {CLIENT: CLOSED, SERVER: SEND_RESPONSE}
+ c.send(Response(status_code=200, headers=[]))
+ c.send(EndOfMessage())
+ assert c.states == {CLIENT: CLOSED, SERVER: MUST_CLOSE}
+ c.send(ConnectionClosed())
+ assert c.states == {CLIENT: CLOSED, SERVER: CLOSED}
+
+
+def test_sendfile():
+ class SendfilePlaceholder:
+ def __len__(self):
+ return 10
+
+ placeholder = SendfilePlaceholder()
+
+ def setup(header, http_version):
+ c = Connection(SERVER)
+ receive_and_get(
+ c, "GET / HTTP/{}\r\nHost: a\r\n\r\n".format(http_version).encode("ascii")
+ )
+ headers = []
+ if header:
+ headers.append(header)
+ c.send(Response(status_code=200, headers=headers))
+ return c, c.send_with_data_passthrough(Data(data=placeholder))
+
+ c, data = setup(("Content-Length", "10"), "1.1")
+ assert data == [placeholder]
+ # Raises an error if the connection object doesn't think we've sent
+ # exactly 10 bytes
+ c.send(EndOfMessage())
+
+ _, data = setup(("Transfer-Encoding", "chunked"), "1.1")
+ assert placeholder in data
+ data[data.index(placeholder)] = b"x" * 10
+ assert b"".join(data) == b"a\r\nxxxxxxxxxx\r\n"
+
+ c, data = setup(None, "1.0")
+ assert data == [placeholder]
+ assert c.our_state is SEND_BODY
+
+
+def test_errors():
+ # After a receive error, you can't receive
+ for role in [CLIENT, SERVER]:
+ c = Connection(our_role=role)
+ c.receive_data(b"gibberish\r\n\r\n")
+ with pytest.raises(RemoteProtocolError):
+ c.next_event()
+ # Now any attempt to receive continues to raise
+ assert c.their_state is ERROR
+ assert c.our_state is not ERROR
+ print(c._cstate.states)
+ with pytest.raises(RemoteProtocolError):
+ c.next_event()
+ # But we can still yell at the client for sending us gibberish
+ if role is SERVER:
+ assert (
+ c.send(Response(status_code=400, headers=[]))
+ == b"HTTP/1.1 400 \r\nConnection: close\r\n\r\n"
+ )
+
+ # After an error sending, you can no longer send
+ # (This is especially important for things like content-length errors,
+ # where there's complex internal state being modified)
+ def conn(role):
+ c = Connection(our_role=role)
+ if role is SERVER:
+ # Put it into the state where it *could* send a response...
+ receive_and_get(c, b"GET / HTTP/1.0\r\n\r\n")
+ assert c.our_state is SEND_RESPONSE
+ return c
+
+ for role in [CLIENT, SERVER]:
+ if role is CLIENT:
+ # This HTTP/1.0 request won't be detected as bad until after we go
+ # through the state machine and hit the writing code
+ good = Request(method="GET", target="/", headers=[("Host", "example.com")])
+ bad = Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "example.com")],
+ http_version="1.0",
+ )
+ elif role is SERVER:
+ good = Response(status_code=200, headers=[])
+ bad = Response(status_code=200, headers=[], http_version="1.0")
+ # Make sure 'good' actually is good
+ c = conn(role)
+ c.send(good)
+ assert c.our_state is not ERROR
+ # Do that again, but this time sending 'bad' first
+ c = conn(role)
+ with pytest.raises(LocalProtocolError):
+ c.send(bad)
+ assert c.our_state is ERROR
+ assert c.their_state is not ERROR
+ # Now 'good' is not so good
+ with pytest.raises(LocalProtocolError):
+ c.send(good)
+
+ # And check send_failed() too
+ c = conn(role)
+ c.send_failed()
+ assert c.our_state is ERROR
+ assert c.their_state is not ERROR
+ # This is idempotent
+ c.send_failed()
+ assert c.our_state is ERROR
+ assert c.their_state is not ERROR
+
+
+def test_idle_receive_nothing():
+ # At one point this incorrectly raised an error
+ for role in [CLIENT, SERVER]:
+ c = Connection(role)
+ assert c.next_event() is NEED_DATA
+
+
+def test_connection_drop():
+ c = Connection(SERVER)
+ c.receive_data(b"GET /")
+ assert c.next_event() is NEED_DATA
+ c.receive_data(b"")
+ with pytest.raises(RemoteProtocolError):
+ c.next_event()
+
+
+def test_408_request_timeout():
+ # Should be able to send this spontaneously as a server without seeing
+ # anything from client
+ p = ConnectionPair()
+ p.send(SERVER, Response(status_code=408, headers=[]))
+
+
+# This used to raise IndexError
+def test_empty_request():
+ c = Connection(SERVER)
+ c.receive_data(b"\r\n")
+ with pytest.raises(RemoteProtocolError):
+ c.next_event()
+
+
+# This used to raise IndexError
+def test_empty_response():
+ c = Connection(CLIENT)
+ c.send(Request(method="GET", target="/", headers=[("Host", "a")]))
+ c.receive_data(b"\r\n")
+ with pytest.raises(RemoteProtocolError):
+ c.next_event()
+
+
+@pytest.mark.parametrize(
+ "data",
+ [
+ b"\x00",
+ b"\x20",
+ b"\x16\x03\x01\x00\xa5", # Typical start of a TLS Client Hello
+ ],
+)
+def test_early_detection_of_invalid_request(data):
+ c = Connection(SERVER)
+ # Early detection should occur before even receiving a `\r\n`
+ c.receive_data(data)
+ with pytest.raises(RemoteProtocolError):
+ c.next_event()
+
+
+@pytest.mark.parametrize(
+ "data",
+ [
+ b"\x00",
+ b"\x20",
+ b"\x16\x03\x03\x00\x31", # Typical start of a TLS Server Hello
+ ],
+)
+def test_early_detection_of_invalid_response(data):
+ c = Connection(CLIENT)
+ # Early detection should occur before even receiving a `\r\n`
+ c.receive_data(data)
+ with pytest.raises(RemoteProtocolError):
+ c.next_event()
+
+
+# This used to give different headers for HEAD and GET.
+# The correct way to handle HEAD is to put whatever headers we *would* have
+# put if it were a GET -- even though we know that for HEAD, those headers
+# will be ignored.
+def test_HEAD_framing_headers():
+ def setup(method, http_version):
+ c = Connection(SERVER)
+ c.receive_data(
+ method + b" / HTTP/" + http_version + b"\r\n" + b"Host: example.com\r\n\r\n"
+ )
+ assert type(c.next_event()) is Request
+ assert type(c.next_event()) is EndOfMessage
+ return c
+
+ for method in [b"GET", b"HEAD"]:
+ # No Content-Length, HTTP/1.1 peer, should use chunked
+ c = setup(method, b"1.1")
+ assert (
+ c.send(Response(status_code=200, headers=[])) == b"HTTP/1.1 200 \r\n"
+ b"Transfer-Encoding: chunked\r\n\r\n"
+ )
+
+ # No Content-Length, HTTP/1.0 peer, frame with connection: close
+ c = setup(method, b"1.0")
+ assert (
+ c.send(Response(status_code=200, headers=[])) == b"HTTP/1.1 200 \r\n"
+ b"Connection: close\r\n\r\n"
+ )
+
+ # Content-Length + Transfer-Encoding, TE wins
+ c = setup(method, b"1.1")
+ assert (
+ c.send(
+ Response(
+ status_code=200,
+ headers=[
+ ("Content-Length", "100"),
+ ("Transfer-Encoding", "chunked"),
+ ],
+ )
+ )
+ == b"HTTP/1.1 200 \r\n"
+ b"Transfer-Encoding: chunked\r\n\r\n"
+ )
+
+
+def test_special_exceptions_for_lost_connection_in_message_body():
+ c = Connection(SERVER)
+ c.receive_data(
+ b"POST / HTTP/1.1\r\n" b"Host: example.com\r\n" b"Content-Length: 100\r\n\r\n"
+ )
+ assert type(c.next_event()) is Request
+ assert c.next_event() is NEED_DATA
+ c.receive_data(b"12345")
+ assert c.next_event() == Data(data=b"12345")
+ c.receive_data(b"")
+ with pytest.raises(RemoteProtocolError) as excinfo:
+ c.next_event()
+ assert "received 5 bytes" in str(excinfo.value)
+ assert "expected 100" in str(excinfo.value)
+
+ c = Connection(SERVER)
+ c.receive_data(
+ b"POST / HTTP/1.1\r\n"
+ b"Host: example.com\r\n"
+ b"Transfer-Encoding: chunked\r\n\r\n"
+ )
+ assert type(c.next_event()) is Request
+ assert c.next_event() is NEED_DATA
+ c.receive_data(b"8\r\n012345")
+ assert c.next_event().data == b"012345"
+ c.receive_data(b"")
+ with pytest.raises(RemoteProtocolError) as excinfo:
+ c.next_event()
+ assert "incomplete chunked read" in str(excinfo.value)
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/test_events.py b/.venv/lib/python3.9/site-packages/h11/tests/test_events.py
new file mode 100644
index 0000000..e20f741
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/tests/test_events.py
@@ -0,0 +1,179 @@
+from http import HTTPStatus
+
+import pytest
+
+from .. import _events
+from .._events import *
+from .._util import LocalProtocolError
+
+
+def test_event_bundle():
+ class T(_events._EventBundle):
+ _fields = ["a", "b"]
+ _defaults = {"b": 1}
+
+ def _validate(self):
+ if self.a == 0:
+ raise ValueError
+
+ # basic construction and methods
+ t = T(a=1, b=0)
+ assert repr(t) == "T(a=1, b=0)"
+ assert t == T(a=1, b=0)
+ assert not (t == T(a=2, b=0))
+ assert not (t != T(a=1, b=0))
+ assert t != T(a=2, b=0)
+ with pytest.raises(TypeError):
+ hash(t)
+
+ # check defaults
+ t = T(a=10)
+ assert t.a == 10
+ assert t.b == 1
+
+ # no positional args
+ with pytest.raises(TypeError):
+ T(1)
+
+ with pytest.raises(TypeError):
+ T(1, a=1, b=0)
+
+ # unknown field
+ with pytest.raises(TypeError):
+ T(a=1, b=0, c=10)
+
+ # missing required field
+ with pytest.raises(TypeError) as exc:
+ T(b=0)
+ # make sure we error on the right missing kwarg
+ assert "kwarg a" in str(exc.value)
+
+ # _validate is called
+ with pytest.raises(ValueError):
+ T(a=0, b=0)
+
+
+def test_events():
+ with pytest.raises(LocalProtocolError):
+ # Missing Host:
+ req = Request(
+ method="GET", target="/", headers=[("a", "b")], http_version="1.1"
+ )
+ # But this is okay (HTTP/1.0)
+ req = Request(method="GET", target="/", headers=[("a", "b")], http_version="1.0")
+ # fields are normalized
+ assert req.method == b"GET"
+ assert req.target == b"/"
+ assert req.headers == [(b"a", b"b")]
+ assert req.http_version == b"1.0"
+
+ # This is also okay -- has a Host (with weird capitalization, which is ok)
+ req = Request(
+ method="GET",
+ target="/",
+ headers=[("a", "b"), ("hOSt", "example.com")],
+ http_version="1.1",
+ )
+ # we normalize header capitalization
+ assert req.headers == [(b"a", b"b"), (b"host", b"example.com")]
+
+ # Multiple host is bad too
+ with pytest.raises(LocalProtocolError):
+ req = Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "a"), ("Host", "a")],
+ http_version="1.1",
+ )
+ # Even for HTTP/1.0
+ with pytest.raises(LocalProtocolError):
+ req = Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "a"), ("Host", "a")],
+ http_version="1.0",
+ )
+
+ # Header values are validated
+ for bad_char in "\x00\r\n\f\v":
+ with pytest.raises(LocalProtocolError):
+ req = Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "a"), ("Foo", "asd" + bad_char)],
+ http_version="1.0",
+ )
+
+ # But for compatibility we allow non-whitespace control characters, even
+ # though they're forbidden by the spec.
+ Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "a"), ("Foo", "asd\x01\x02\x7f")],
+ http_version="1.0",
+ )
+
+ # Request target is validated
+ for bad_char in b"\x00\x20\x7f\xee":
+ target = bytearray(b"/")
+ target.append(bad_char)
+ with pytest.raises(LocalProtocolError):
+ Request(
+ method="GET", target=target, headers=[("Host", "a")], http_version="1.1"
+ )
+
+ ir = InformationalResponse(status_code=100, headers=[("Host", "a")])
+ assert ir.status_code == 100
+ assert ir.headers == [(b"host", b"a")]
+ assert ir.http_version == b"1.1"
+
+ with pytest.raises(LocalProtocolError):
+ InformationalResponse(status_code=200, headers=[("Host", "a")])
+
+ resp = Response(status_code=204, headers=[], http_version="1.0")
+ assert resp.status_code == 204
+ assert resp.headers == []
+ assert resp.http_version == b"1.0"
+
+ with pytest.raises(LocalProtocolError):
+ resp = Response(status_code=100, headers=[], http_version="1.0")
+
+ with pytest.raises(LocalProtocolError):
+ Response(status_code="100", headers=[], http_version="1.0")
+
+ with pytest.raises(LocalProtocolError):
+ InformationalResponse(status_code=b"100", headers=[], http_version="1.0")
+
+ d = Data(data=b"asdf")
+ assert d.data == b"asdf"
+
+ eom = EndOfMessage()
+ assert eom.headers == []
+
+ cc = ConnectionClosed()
+ assert repr(cc) == "ConnectionClosed()"
+
+
+def test_intenum_status_code():
+ # https://github.com/python-hyper/h11/issues/72
+
+ r = Response(status_code=HTTPStatus.OK, headers=[], http_version="1.0")
+ assert r.status_code == HTTPStatus.OK
+ assert type(r.status_code) is not type(HTTPStatus.OK)
+ assert type(r.status_code) is int
+
+
+def test_header_casing():
+ r = Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "example.org"), ("Connection", "keep-alive")],
+ http_version="1.1",
+ )
+ assert len(r.headers) == 2
+ assert r.headers[0] == (b"host", b"example.org")
+ assert r.headers == [(b"host", b"example.org"), (b"connection", b"keep-alive")]
+ assert r.headers.raw_items() == [
+ (b"Host", b"example.org"),
+ (b"Connection", b"keep-alive"),
+ ]
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/test_headers.py b/.venv/lib/python3.9/site-packages/h11/tests/test_headers.py
new file mode 100644
index 0000000..ff3dc8d
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/tests/test_headers.py
@@ -0,0 +1,151 @@
+import pytest
+
+from .._headers import *
+
+
+def test_normalize_and_validate():
+ assert normalize_and_validate([("foo", "bar")]) == [(b"foo", b"bar")]
+ assert normalize_and_validate([(b"foo", b"bar")]) == [(b"foo", b"bar")]
+
+ # no leading/trailing whitespace in names
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([(b"foo ", "bar")])
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([(b" foo", "bar")])
+
+ # no weird characters in names
+ with pytest.raises(LocalProtocolError) as excinfo:
+ normalize_and_validate([(b"foo bar", b"baz")])
+ assert "foo bar" in str(excinfo.value)
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([(b"foo\x00bar", b"baz")])
+ # Not even 8-bit characters:
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([(b"foo\xffbar", b"baz")])
+ # And not even the control characters we allow in values:
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([(b"foo\x01bar", b"baz")])
+
+ # no return or NUL characters in values
+ with pytest.raises(LocalProtocolError) as excinfo:
+ normalize_and_validate([("foo", "bar\rbaz")])
+ assert "bar\\rbaz" in str(excinfo.value)
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([("foo", "bar\nbaz")])
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([("foo", "bar\x00baz")])
+ # no leading/trailing whitespace
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([("foo", "barbaz ")])
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([("foo", " barbaz")])
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([("foo", "barbaz\t")])
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([("foo", "\tbarbaz")])
+
+ # content-length
+ assert normalize_and_validate([("Content-Length", "1")]) == [
+ (b"content-length", b"1")
+ ]
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([("Content-Length", "asdf")])
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([("Content-Length", "1x")])
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([("Content-Length", "1"), ("Content-Length", "2")])
+ assert normalize_and_validate(
+ [("Content-Length", "0"), ("Content-Length", "0")]
+ ) == [(b"content-length", b"0")]
+ assert normalize_and_validate([("Content-Length", "0 , 0")]) == [
+ (b"content-length", b"0")
+ ]
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate(
+ [("Content-Length", "1"), ("Content-Length", "1"), ("Content-Length", "2")]
+ )
+ with pytest.raises(LocalProtocolError):
+ normalize_and_validate([("Content-Length", "1 , 1,2")])
+
+ # transfer-encoding
+ assert normalize_and_validate([("Transfer-Encoding", "chunked")]) == [
+ (b"transfer-encoding", b"chunked")
+ ]
+ assert normalize_and_validate([("Transfer-Encoding", "cHuNkEd")]) == [
+ (b"transfer-encoding", b"chunked")
+ ]
+ with pytest.raises(LocalProtocolError) as excinfo:
+ normalize_and_validate([("Transfer-Encoding", "gzip")])
+ assert excinfo.value.error_status_hint == 501 # Not Implemented
+ with pytest.raises(LocalProtocolError) as excinfo:
+ normalize_and_validate(
+ [("Transfer-Encoding", "chunked"), ("Transfer-Encoding", "gzip")]
+ )
+ assert excinfo.value.error_status_hint == 501 # Not Implemented
+
+
+def test_get_set_comma_header():
+ headers = normalize_and_validate(
+ [
+ ("Connection", "close"),
+ ("whatever", "something"),
+ ("connectiON", "fOo,, , BAR"),
+ ]
+ )
+
+ assert get_comma_header(headers, b"connection") == [b"close", b"foo", b"bar"]
+
+ headers = set_comma_header(headers, b"newthing", ["a", "b"])
+
+ with pytest.raises(LocalProtocolError):
+ set_comma_header(headers, b"newthing", [" a", "b"])
+
+ assert headers == [
+ (b"connection", b"close"),
+ (b"whatever", b"something"),
+ (b"connection", b"fOo,, , BAR"),
+ (b"newthing", b"a"),
+ (b"newthing", b"b"),
+ ]
+
+ headers = set_comma_header(headers, b"whatever", ["different thing"])
+
+ assert headers == [
+ (b"connection", b"close"),
+ (b"connection", b"fOo,, , BAR"),
+ (b"newthing", b"a"),
+ (b"newthing", b"b"),
+ (b"whatever", b"different thing"),
+ ]
+
+
+def test_has_100_continue():
+ from .._events import Request
+
+ assert has_expect_100_continue(
+ Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "example.com"), ("Expect", "100-continue")],
+ )
+ )
+ assert not has_expect_100_continue(
+ Request(method="GET", target="/", headers=[("Host", "example.com")])
+ )
+ # Case insensitive
+ assert has_expect_100_continue(
+ Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "example.com"), ("Expect", "100-Continue")],
+ )
+ )
+ # Doesn't work in HTTP/1.0
+ assert not has_expect_100_continue(
+ Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "example.com"), ("Expect", "100-continue")],
+ http_version="1.0",
+ )
+ )
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/test_helpers.py b/.venv/lib/python3.9/site-packages/h11/tests/test_helpers.py
new file mode 100644
index 0000000..1477947
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/tests/test_helpers.py
@@ -0,0 +1,23 @@
+from .helpers import *
+
+
+def test_normalize_data_events():
+ assert normalize_data_events(
+ [
+ Data(data=bytearray(b"1")),
+ Data(data=b"2"),
+ Response(status_code=200, headers=[]),
+ Data(data=b"3"),
+ Data(data=b"4"),
+ EndOfMessage(),
+ Data(data=b"5"),
+ Data(data=b"6"),
+ Data(data=b"7"),
+ ]
+ ) == [
+ Data(data=b"12"),
+ Response(status_code=200, headers=[]),
+ Data(data=b"34"),
+ EndOfMessage(),
+ Data(data=b"567"),
+ ]
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/test_io.py b/.venv/lib/python3.9/site-packages/h11/tests/test_io.py
new file mode 100644
index 0000000..459a627
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/tests/test_io.py
@@ -0,0 +1,544 @@
+import pytest
+
+from .._events import *
+from .._headers import Headers, normalize_and_validate
+from .._readers import (
+ _obsolete_line_fold,
+ ChunkedReader,
+ ContentLengthReader,
+ Http10Reader,
+ READERS,
+)
+from .._receivebuffer import ReceiveBuffer
+from .._state import *
+from .._util import LocalProtocolError
+from .._writers import (
+ ChunkedWriter,
+ ContentLengthWriter,
+ Http10Writer,
+ write_any_response,
+ write_headers,
+ write_request,
+ WRITERS,
+)
+from .helpers import normalize_data_events
+
+SIMPLE_CASES = [
+ (
+ (CLIENT, IDLE),
+ Request(
+ method="GET",
+ target="/a",
+ headers=[("Host", "foo"), ("Connection", "close")],
+ ),
+ b"GET /a HTTP/1.1\r\nHost: foo\r\nConnection: close\r\n\r\n",
+ ),
+ (
+ (SERVER, SEND_RESPONSE),
+ Response(status_code=200, headers=[("Connection", "close")], reason=b"OK"),
+ b"HTTP/1.1 200 OK\r\nConnection: close\r\n\r\n",
+ ),
+ (
+ (SERVER, SEND_RESPONSE),
+ Response(status_code=200, headers=[], reason=b"OK"),
+ b"HTTP/1.1 200 OK\r\n\r\n",
+ ),
+ (
+ (SERVER, SEND_RESPONSE),
+ InformationalResponse(
+ status_code=101, headers=[("Upgrade", "websocket")], reason=b"Upgrade"
+ ),
+ b"HTTP/1.1 101 Upgrade\r\nUpgrade: websocket\r\n\r\n",
+ ),
+ (
+ (SERVER, SEND_RESPONSE),
+ InformationalResponse(status_code=101, headers=[], reason=b"Upgrade"),
+ b"HTTP/1.1 101 Upgrade\r\n\r\n",
+ ),
+]
+
+
+def dowrite(writer, obj):
+ got_list = []
+ writer(obj, got_list.append)
+ return b"".join(got_list)
+
+
+def tw(writer, obj, expected):
+ got = dowrite(writer, obj)
+ assert got == expected
+
+
+def makebuf(data):
+ buf = ReceiveBuffer()
+ buf += data
+ return buf
+
+
+def tr(reader, data, expected):
+ def check(got):
+ assert got == expected
+ # Headers should always be returned as bytes, not e.g. bytearray
+ # https://github.com/python-hyper/wsproto/pull/54#issuecomment-377709478
+ for name, value in getattr(got, "headers", []):
+ print(name, value)
+ assert type(name) is bytes
+ assert type(value) is bytes
+
+ # Simple: consume whole thing
+ buf = makebuf(data)
+ check(reader(buf))
+ assert not buf
+
+ # Incrementally growing buffer
+ buf = ReceiveBuffer()
+ for i in range(len(data)):
+ assert reader(buf) is None
+ buf += data[i : i + 1]
+ check(reader(buf))
+
+ # Trailing data
+ buf = makebuf(data)
+ buf += b"trailing"
+ check(reader(buf))
+ assert bytes(buf) == b"trailing"
+
+
+def test_writers_simple():
+ for ((role, state), event, binary) in SIMPLE_CASES:
+ tw(WRITERS[role, state], event, binary)
+
+
+def test_readers_simple():
+ for ((role, state), event, binary) in SIMPLE_CASES:
+ tr(READERS[role, state], binary, event)
+
+
+def test_writers_unusual():
+ # Simple test of the write_headers utility routine
+ tw(
+ write_headers,
+ normalize_and_validate([("foo", "bar"), ("baz", "quux")]),
+ b"foo: bar\r\nbaz: quux\r\n\r\n",
+ )
+ tw(write_headers, Headers([]), b"\r\n")
+
+ # We understand HTTP/1.0, but we don't speak it
+ with pytest.raises(LocalProtocolError):
+ tw(
+ write_request,
+ Request(
+ method="GET",
+ target="/",
+ headers=[("Host", "foo"), ("Connection", "close")],
+ http_version="1.0",
+ ),
+ None,
+ )
+ with pytest.raises(LocalProtocolError):
+ tw(
+ write_any_response,
+ Response(
+ status_code=200, headers=[("Connection", "close")], http_version="1.0"
+ ),
+ None,
+ )
+
+
+def test_readers_unusual():
+ # Reading HTTP/1.0
+ tr(
+ READERS[CLIENT, IDLE],
+ b"HEAD /foo HTTP/1.0\r\nSome: header\r\n\r\n",
+ Request(
+ method="HEAD",
+ target="/foo",
+ headers=[("Some", "header")],
+ http_version="1.0",
+ ),
+ )
+
+ # check no-headers, since it's only legal with HTTP/1.0
+ tr(
+ READERS[CLIENT, IDLE],
+ b"HEAD /foo HTTP/1.0\r\n\r\n",
+ Request(method="HEAD", target="/foo", headers=[], http_version="1.0"),
+ )
+
+ tr(
+ READERS[SERVER, SEND_RESPONSE],
+ b"HTTP/1.0 200 OK\r\nSome: header\r\n\r\n",
+ Response(
+ status_code=200,
+ headers=[("Some", "header")],
+ http_version="1.0",
+ reason=b"OK",
+ ),
+ )
+
+ # single-character header values (actually disallowed by the ABNF in RFC
+ # 7230 -- this is a bug in the standard that we originally copied...)
+ tr(
+ READERS[SERVER, SEND_RESPONSE],
+ b"HTTP/1.0 200 OK\r\n" b"Foo: a a a a a \r\n\r\n",
+ Response(
+ status_code=200,
+ headers=[("Foo", "a a a a a")],
+ http_version="1.0",
+ reason=b"OK",
+ ),
+ )
+
+ # Empty headers -- also legal
+ tr(
+ READERS[SERVER, SEND_RESPONSE],
+ b"HTTP/1.0 200 OK\r\n" b"Foo:\r\n\r\n",
+ Response(
+ status_code=200, headers=[("Foo", "")], http_version="1.0", reason=b"OK"
+ ),
+ )
+
+ tr(
+ READERS[SERVER, SEND_RESPONSE],
+ b"HTTP/1.0 200 OK\r\n" b"Foo: \t \t \r\n\r\n",
+ Response(
+ status_code=200, headers=[("Foo", "")], http_version="1.0", reason=b"OK"
+ ),
+ )
+
+    # Tolerate broken servers that leave off the reason phrase
+ tr(
+ READERS[SERVER, SEND_RESPONSE],
+ b"HTTP/1.0 200\r\n" b"Foo: bar\r\n\r\n",
+ Response(
+ status_code=200, headers=[("Foo", "bar")], http_version="1.0", reason=b""
+ ),
+ )
+
+    # Tolerate header lines with either \r\n or \n endings
+    # (\n\r\n between headers and body)
+ tr(
+ READERS[SERVER, SEND_RESPONSE],
+ b"HTTP/1.1 200 OK\r\nSomeHeader: val\n\r\n",
+ Response(
+ status_code=200,
+ headers=[("SomeHeader", "val")],
+ http_version="1.1",
+ reason="OK",
+ ),
+ )
+
+ # delimited only with \n
+ tr(
+ READERS[SERVER, SEND_RESPONSE],
+ b"HTTP/1.1 200 OK\nSomeHeader1: val1\nSomeHeader2: val2\n\n",
+ Response(
+ status_code=200,
+ headers=[("SomeHeader1", "val1"), ("SomeHeader2", "val2")],
+ http_version="1.1",
+ reason="OK",
+ ),
+ )
+
+ # mixed \r\n and \n
+ tr(
+ READERS[SERVER, SEND_RESPONSE],
+ b"HTTP/1.1 200 OK\r\nSomeHeader1: val1\nSomeHeader2: val2\n\r\n",
+ Response(
+ status_code=200,
+ headers=[("SomeHeader1", "val1"), ("SomeHeader2", "val2")],
+ http_version="1.1",
+ reason="OK",
+ ),
+ )
+
+ # obsolete line folding
+ tr(
+ READERS[CLIENT, IDLE],
+ b"HEAD /foo HTTP/1.1\r\n"
+ b"Host: example.com\r\n"
+ b"Some: multi-line\r\n"
+ b" header\r\n"
+ b"\tnonsense\r\n"
+ b" \t \t\tI guess\r\n"
+ b"Connection: close\r\n"
+ b"More-nonsense: in the\r\n"
+ b" last header \r\n\r\n",
+ Request(
+ method="HEAD",
+ target="/foo",
+ headers=[
+ ("Host", "example.com"),
+ ("Some", "multi-line header nonsense I guess"),
+ ("Connection", "close"),
+ ("More-nonsense", "in the last header"),
+ ],
+ ),
+ )
+
+ with pytest.raises(LocalProtocolError):
+ tr(
+ READERS[CLIENT, IDLE],
+ b"HEAD /foo HTTP/1.1\r\n" b" folded: line\r\n\r\n",
+ None,
+ )
+
+ with pytest.raises(LocalProtocolError):
+ tr(
+ READERS[CLIENT, IDLE],
+ b"HEAD /foo HTTP/1.1\r\n" b"foo : line\r\n\r\n",
+ None,
+ )
+ with pytest.raises(LocalProtocolError):
+ tr(
+ READERS[CLIENT, IDLE],
+ b"HEAD /foo HTTP/1.1\r\n" b"foo\t: line\r\n\r\n",
+ None,
+ )
+ with pytest.raises(LocalProtocolError):
+ tr(READERS[CLIENT, IDLE], b"HEAD /foo HTTP/1.1\r\n" b": line\r\n\r\n", None)
+
+
+def test__obsolete_line_fold_bytes():
+ # _obsolete_line_fold has a defensive cast to bytearray, which is
+ # necessary to protect against O(n^2) behavior in case anyone ever passes
+ # in regular bytestrings... but right now we never pass in regular
+ # bytestrings. so this test just exists to get some coverage on that
+ # defensive cast.
+ assert list(_obsolete_line_fold([b"aaa", b"bbb", b" ccc", b"ddd"])) == [
+ b"aaa",
+ bytearray(b"bbb ccc"),
+ b"ddd",
+ ]
+
+
+def _run_reader_iter(reader, buf, do_eof):
+ while True:
+ event = reader(buf)
+ if event is None:
+ break
+ yield event
+ # body readers have undefined behavior after returning EndOfMessage,
+ # because this changes the state so they don't get called again
+ if type(event) is EndOfMessage:
+ break
+ if do_eof:
+ assert not buf
+ yield reader.read_eof()
+
+
+def _run_reader(*args):
+ events = list(_run_reader_iter(*args))
+ return normalize_data_events(events)
+
+
+def t_body_reader(thunk, data, expected, do_eof=False):
+ # Simple: consume whole thing
+ print("Test 1")
+ buf = makebuf(data)
+ assert _run_reader(thunk(), buf, do_eof) == expected
+
+ # Incrementally growing buffer
+ print("Test 2")
+ reader = thunk()
+ buf = ReceiveBuffer()
+ events = []
+ for i in range(len(data)):
+ events += _run_reader(reader, buf, False)
+ buf += data[i : i + 1]
+ events += _run_reader(reader, buf, do_eof)
+ assert normalize_data_events(events) == expected
+
+ is_complete = any(type(event) is EndOfMessage for event in expected)
+ if is_complete and not do_eof:
+ buf = makebuf(data + b"trailing")
+ assert _run_reader(thunk(), buf, False) == expected
+
+
+def test_ContentLengthReader():
+ t_body_reader(lambda: ContentLengthReader(0), b"", [EndOfMessage()])
+
+ t_body_reader(
+ lambda: ContentLengthReader(10),
+ b"0123456789",
+ [Data(data=b"0123456789"), EndOfMessage()],
+ )
+
+
+def test_Http10Reader():
+ t_body_reader(Http10Reader, b"", [EndOfMessage()], do_eof=True)
+ t_body_reader(Http10Reader, b"asdf", [Data(data=b"asdf")], do_eof=False)
+ t_body_reader(
+ Http10Reader, b"asdf", [Data(data=b"asdf"), EndOfMessage()], do_eof=True
+ )
+
+
+def test_ChunkedReader():
+ t_body_reader(ChunkedReader, b"0\r\n\r\n", [EndOfMessage()])
+
+ t_body_reader(
+ ChunkedReader,
+ b"0\r\nSome: header\r\n\r\n",
+ [EndOfMessage(headers=[("Some", "header")])],
+ )
+
+ t_body_reader(
+ ChunkedReader,
+ b"5\r\n01234\r\n"
+ + b"10\r\n0123456789abcdef\r\n"
+ + b"0\r\n"
+ + b"Some: header\r\n\r\n",
+ [
+ Data(data=b"012340123456789abcdef"),
+ EndOfMessage(headers=[("Some", "header")]),
+ ],
+ )
+
+ t_body_reader(
+ ChunkedReader,
+ b"5\r\n01234\r\n" + b"10\r\n0123456789abcdef\r\n" + b"0\r\n\r\n",
+ [Data(data=b"012340123456789abcdef"), EndOfMessage()],
+ )
+
+ # handles upper and lowercase hex
+ t_body_reader(
+ ChunkedReader,
+ b"aA\r\n" + b"x" * 0xAA + b"\r\n" + b"0\r\n\r\n",
+ [Data(data=b"x" * 0xAA), EndOfMessage()],
+ )
+
+ # refuses arbitrarily long chunk integers
+ with pytest.raises(LocalProtocolError):
+ # Technically this is legal HTTP/1.1, but we refuse to process chunk
+ # sizes that don't fit into 20 characters of hex
+ t_body_reader(ChunkedReader, b"9" * 100 + b"\r\nxxx", [Data(data=b"xxx")])
+
+ # refuses garbage in the chunk count
+ with pytest.raises(LocalProtocolError):
+ t_body_reader(ChunkedReader, b"10\x00\r\nxxx", None)
+
+ # handles (and discards) "chunk extensions" omg wtf
+ t_body_reader(
+ ChunkedReader,
+ b"5; hello=there\r\n"
+ + b"xxxxx"
+ + b"\r\n"
+ + b'0; random="junk"; some=more; canbe=lonnnnngg\r\n\r\n',
+ [Data(data=b"xxxxx"), EndOfMessage()],
+ )
+
+
+def test_ContentLengthWriter():
+ w = ContentLengthWriter(5)
+ assert dowrite(w, Data(data=b"123")) == b"123"
+ assert dowrite(w, Data(data=b"45")) == b"45"
+ assert dowrite(w, EndOfMessage()) == b""
+
+ w = ContentLengthWriter(5)
+ with pytest.raises(LocalProtocolError):
+ dowrite(w, Data(data=b"123456"))
+
+ w = ContentLengthWriter(5)
+ dowrite(w, Data(data=b"123"))
+ with pytest.raises(LocalProtocolError):
+ dowrite(w, Data(data=b"456"))
+
+ w = ContentLengthWriter(5)
+ dowrite(w, Data(data=b"123"))
+ with pytest.raises(LocalProtocolError):
+ dowrite(w, EndOfMessage())
+
+ w = ContentLengthWriter(5)
+    assert dowrite(w, Data(data=b"123")) == b"123"
+    assert dowrite(w, Data(data=b"45")) == b"45"
+ with pytest.raises(LocalProtocolError):
+ dowrite(w, EndOfMessage(headers=[("Etag", "asdf")]))
+
+
+def test_ChunkedWriter():
+ w = ChunkedWriter()
+ assert dowrite(w, Data(data=b"aaa")) == b"3\r\naaa\r\n"
+ assert dowrite(w, Data(data=b"a" * 20)) == b"14\r\n" + b"a" * 20 + b"\r\n"
+
+ assert dowrite(w, Data(data=b"")) == b""
+
+ assert dowrite(w, EndOfMessage()) == b"0\r\n\r\n"
+
+ assert (
+ dowrite(w, EndOfMessage(headers=[("Etag", "asdf"), ("a", "b")]))
+ == b"0\r\nEtag: asdf\r\na: b\r\n\r\n"
+ )
+
+
+def test_Http10Writer():
+ w = Http10Writer()
+ assert dowrite(w, Data(data=b"1234")) == b"1234"
+ assert dowrite(w, EndOfMessage()) == b""
+
+ with pytest.raises(LocalProtocolError):
+ dowrite(w, EndOfMessage(headers=[("Etag", "asdf")]))
+
+
+def test_reject_garbage_after_response_line():
+    with pytest.raises(LocalProtocolError):
+        tr(READERS[SERVER, SEND_RESPONSE], b"HTTP/1.0 200 OK\x00xxxx\r\n\r\n", None)
+
+
+def test_reject_garbage_after_request_line():
+    with pytest.raises(LocalProtocolError):
+        tr(
+            READERS[CLIENT, IDLE],
+            b"HEAD /foo HTTP/1.1 xxxxxx\r\n" b"Host: a\r\n\r\n",
+            None,
+        )
+
+
+def test_reject_garbage_in_header_line():
+ with pytest.raises(LocalProtocolError):
+ tr(
+ READERS[CLIENT, IDLE],
+ b"HEAD /foo HTTP/1.1\r\n" b"Host: foo\x00bar\r\n\r\n",
+ None,
+ )
+
+
+def test_reject_non_vchar_in_path():
+ for bad_char in b"\x00\x20\x7f\xee":
+ message = bytearray(b"HEAD /")
+ message.append(bad_char)
+ message.extend(b" HTTP/1.1\r\nHost: foobar\r\n\r\n")
+ with pytest.raises(LocalProtocolError):
+ tr(READERS[CLIENT, IDLE], message, None)
+
+
+# https://github.com/python-hyper/h11/issues/57
+def test_allow_some_garbage_in_cookies():
+ tr(
+ READERS[CLIENT, IDLE],
+ b"HEAD /foo HTTP/1.1\r\n"
+ b"Host: foo\r\n"
+ b"Set-Cookie: ___utmvafIumyLc=kUd\x01UpAt; path=/; Max-Age=900\r\n"
+ b"\r\n",
+ Request(
+ method="HEAD",
+ target="/foo",
+ headers=[
+ ("Host", "foo"),
+ ("Set-Cookie", "___utmvafIumyLc=kUd\x01UpAt; path=/; Max-Age=900"),
+ ],
+ ),
+ )
+
+
+def test_host_comes_first():
+ tw(
+ write_headers,
+ normalize_and_validate([("foo", "bar"), ("Host", "example.com")]),
+ b"Host: example.com\r\nfoo: bar\r\n\r\n",
+ )
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/test_receivebuffer.py b/.venv/lib/python3.9/site-packages/h11/tests/test_receivebuffer.py
new file mode 100644
index 0000000..3a61f9d
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/tests/test_receivebuffer.py
@@ -0,0 +1,134 @@
+import re
+
+import pytest
+
+from .._receivebuffer import ReceiveBuffer
+
+
+def test_receivebuffer():
+ b = ReceiveBuffer()
+ assert not b
+ assert len(b) == 0
+ assert bytes(b) == b""
+
+ b += b"123"
+ assert b
+ assert len(b) == 3
+ assert bytes(b) == b"123"
+
+ assert bytes(b) == b"123"
+
+ assert b.maybe_extract_at_most(2) == b"12"
+ assert b
+ assert len(b) == 1
+ assert bytes(b) == b"3"
+
+ assert bytes(b) == b"3"
+
+ assert b.maybe_extract_at_most(10) == b"3"
+ assert bytes(b) == b""
+
+ assert b.maybe_extract_at_most(10) is None
+ assert not b
+
+ ################################################################
+    # maybe_extract_next_line
+ ################################################################
+
+ b += b"123\n456\r\n789\r\n"
+
+ assert b.maybe_extract_next_line() == b"123\n456\r\n"
+ assert bytes(b) == b"789\r\n"
+
+ assert b.maybe_extract_next_line() == b"789\r\n"
+ assert bytes(b) == b""
+
+ b += b"12\r"
+ assert b.maybe_extract_next_line() is None
+ assert bytes(b) == b"12\r"
+
+ b += b"345\n\r"
+ assert b.maybe_extract_next_line() is None
+ assert bytes(b) == b"12\r345\n\r"
+
+ # here we stopped at the middle of b"\r\n" delimiter
+
+ b += b"\n6789aaa123\r\n"
+ assert b.maybe_extract_next_line() == b"12\r345\n\r\n"
+ assert b.maybe_extract_next_line() == b"6789aaa123\r\n"
+ assert b.maybe_extract_next_line() is None
+ assert bytes(b) == b""
+
+ ################################################################
+ # maybe_extract_lines
+ ################################################################
+
+ b += b"123\r\na: b\r\nfoo:bar\r\n\r\ntrailing"
+ lines = b.maybe_extract_lines()
+ assert lines == [b"123", b"a: b", b"foo:bar"]
+ assert bytes(b) == b"trailing"
+
+ assert b.maybe_extract_lines() is None
+
+ b += b"\r\n\r"
+ assert b.maybe_extract_lines() is None
+
+ assert b.maybe_extract_at_most(100) == b"trailing\r\n\r"
+ assert not b
+
+ # Empty body case (as happens at the end of chunked encoding if there are
+ # no trailing headers, e.g.)
+ b += b"\r\ntrailing"
+ assert b.maybe_extract_lines() == []
+ assert bytes(b) == b"trailing"
+
+
+@pytest.mark.parametrize(
+ "data",
+ [
+ pytest.param(
+ (
+ b"HTTP/1.1 200 OK\r\n",
+ b"Content-type: text/plain\r\n",
+ b"Connection: close\r\n",
+ b"\r\n",
+ b"Some body",
+ ),
+ id="with_crlf_delimiter",
+ ),
+ pytest.param(
+ (
+ b"HTTP/1.1 200 OK\n",
+ b"Content-type: text/plain\n",
+ b"Connection: close\n",
+ b"\n",
+ b"Some body",
+ ),
+ id="with_lf_only_delimiter",
+ ),
+ pytest.param(
+ (
+ b"HTTP/1.1 200 OK\n",
+ b"Content-type: text/plain\r\n",
+ b"Connection: close\n",
+ b"\n",
+ b"Some body",
+ ),
+ id="with_mixed_crlf_and_lf",
+ ),
+ ],
+)
+def test_receivebuffer_for_invalid_delimiter(data):
+ b = ReceiveBuffer()
+
+ for line in data:
+ b += line
+
+ lines = b.maybe_extract_lines()
+
+ assert lines == [
+ b"HTTP/1.1 200 OK",
+ b"Content-type: text/plain",
+ b"Connection: close",
+ ]
+ assert bytes(b) == b"Some body"
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/test_state.py b/.venv/lib/python3.9/site-packages/h11/tests/test_state.py
new file mode 100644
index 0000000..efe83f0
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/tests/test_state.py
@@ -0,0 +1,250 @@
+import pytest
+
+from .._events import *
+from .._state import *
+from .._state import _SWITCH_CONNECT, _SWITCH_UPGRADE, ConnectionState
+from .._util import LocalProtocolError
+
+
+def test_ConnectionState():
+ cs = ConnectionState()
+
+ # Basic event-triggered transitions
+
+ assert cs.states == {CLIENT: IDLE, SERVER: IDLE}
+
+ cs.process_event(CLIENT, Request)
+ # The SERVER-Request special case:
+ assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
+
+ # Illegal transitions raise an error and nothing happens
+ with pytest.raises(LocalProtocolError):
+ cs.process_event(CLIENT, Request)
+ assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
+
+ cs.process_event(SERVER, InformationalResponse)
+ assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
+
+ cs.process_event(SERVER, Response)
+ assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_BODY}
+
+ cs.process_event(CLIENT, EndOfMessage)
+ cs.process_event(SERVER, EndOfMessage)
+ assert cs.states == {CLIENT: DONE, SERVER: DONE}
+
+ # State-triggered transition
+
+ cs.process_event(SERVER, ConnectionClosed)
+ assert cs.states == {CLIENT: MUST_CLOSE, SERVER: CLOSED}
+
+
+def test_ConnectionState_keep_alive():
+ # keep_alive = False
+ cs = ConnectionState()
+ cs.process_event(CLIENT, Request)
+ cs.process_keep_alive_disabled()
+ cs.process_event(CLIENT, EndOfMessage)
+ assert cs.states == {CLIENT: MUST_CLOSE, SERVER: SEND_RESPONSE}
+
+ cs.process_event(SERVER, Response)
+ cs.process_event(SERVER, EndOfMessage)
+ assert cs.states == {CLIENT: MUST_CLOSE, SERVER: MUST_CLOSE}
+
+
+def test_ConnectionState_keep_alive_in_DONE():
+ # Check that if keep_alive is disabled when the CLIENT is already in DONE,
+ # then this is sufficient to immediately trigger the DONE -> MUST_CLOSE
+ # transition
+ cs = ConnectionState()
+ cs.process_event(CLIENT, Request)
+ cs.process_event(CLIENT, EndOfMessage)
+ assert cs.states[CLIENT] is DONE
+ cs.process_keep_alive_disabled()
+ assert cs.states[CLIENT] is MUST_CLOSE
+
+
+def test_ConnectionState_switch_denied():
+ for switch_type in (_SWITCH_CONNECT, _SWITCH_UPGRADE):
+ for deny_early in (True, False):
+ cs = ConnectionState()
+ cs.process_client_switch_proposal(switch_type)
+ cs.process_event(CLIENT, Request)
+ cs.process_event(CLIENT, Data)
+ assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
+
+ assert switch_type in cs.pending_switch_proposals
+
+ if deny_early:
+ # before client reaches DONE
+ cs.process_event(SERVER, Response)
+ assert not cs.pending_switch_proposals
+
+ cs.process_event(CLIENT, EndOfMessage)
+
+ if deny_early:
+ assert cs.states == {CLIENT: DONE, SERVER: SEND_BODY}
+ else:
+ assert cs.states == {
+ CLIENT: MIGHT_SWITCH_PROTOCOL,
+ SERVER: SEND_RESPONSE,
+ }
+
+ cs.process_event(SERVER, InformationalResponse)
+ assert cs.states == {
+ CLIENT: MIGHT_SWITCH_PROTOCOL,
+ SERVER: SEND_RESPONSE,
+ }
+
+ cs.process_event(SERVER, Response)
+ assert cs.states == {CLIENT: DONE, SERVER: SEND_BODY}
+ assert not cs.pending_switch_proposals
+
+
+_response_type_for_switch = {
+ _SWITCH_UPGRADE: InformationalResponse,
+ _SWITCH_CONNECT: Response,
+ None: Response,
+}
+
+
+def test_ConnectionState_protocol_switch_accepted():
+ for switch_event in [_SWITCH_UPGRADE, _SWITCH_CONNECT]:
+ cs = ConnectionState()
+ cs.process_client_switch_proposal(switch_event)
+ cs.process_event(CLIENT, Request)
+ cs.process_event(CLIENT, Data)
+ assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
+
+ cs.process_event(CLIENT, EndOfMessage)
+ assert cs.states == {CLIENT: MIGHT_SWITCH_PROTOCOL, SERVER: SEND_RESPONSE}
+
+ cs.process_event(SERVER, InformationalResponse)
+ assert cs.states == {CLIENT: MIGHT_SWITCH_PROTOCOL, SERVER: SEND_RESPONSE}
+
+ cs.process_event(SERVER, _response_type_for_switch[switch_event], switch_event)
+ assert cs.states == {CLIENT: SWITCHED_PROTOCOL, SERVER: SWITCHED_PROTOCOL}
+
+
+def test_ConnectionState_double_protocol_switch():
+ # CONNECT + Upgrade is legal! Very silly, but legal. So we support
+ # it. Because sometimes doing the silly thing is easier than not.
+ for server_switch in [None, _SWITCH_UPGRADE, _SWITCH_CONNECT]:
+ cs = ConnectionState()
+ cs.process_client_switch_proposal(_SWITCH_UPGRADE)
+ cs.process_client_switch_proposal(_SWITCH_CONNECT)
+ cs.process_event(CLIENT, Request)
+ cs.process_event(CLIENT, EndOfMessage)
+ assert cs.states == {CLIENT: MIGHT_SWITCH_PROTOCOL, SERVER: SEND_RESPONSE}
+ cs.process_event(
+ SERVER, _response_type_for_switch[server_switch], server_switch
+ )
+ if server_switch is None:
+ assert cs.states == {CLIENT: DONE, SERVER: SEND_BODY}
+ else:
+ assert cs.states == {CLIENT: SWITCHED_PROTOCOL, SERVER: SWITCHED_PROTOCOL}
+
+
+def test_ConnectionState_inconsistent_protocol_switch():
+ for client_switches, server_switch in [
+ ([], _SWITCH_CONNECT),
+ ([], _SWITCH_UPGRADE),
+ ([_SWITCH_UPGRADE], _SWITCH_CONNECT),
+ ([_SWITCH_CONNECT], _SWITCH_UPGRADE),
+ ]:
+ cs = ConnectionState()
+ for client_switch in client_switches:
+ cs.process_client_switch_proposal(client_switch)
+ cs.process_event(CLIENT, Request)
+ with pytest.raises(LocalProtocolError):
+ cs.process_event(SERVER, Response, server_switch)
+
+
+def test_ConnectionState_keepalive_protocol_switch_interaction():
+ # keep_alive=False + pending_switch_proposals
+ cs = ConnectionState()
+ cs.process_client_switch_proposal(_SWITCH_UPGRADE)
+ cs.process_event(CLIENT, Request)
+ cs.process_keep_alive_disabled()
+ cs.process_event(CLIENT, Data)
+ assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
+
+ # the protocol switch "wins"
+ cs.process_event(CLIENT, EndOfMessage)
+ assert cs.states == {CLIENT: MIGHT_SWITCH_PROTOCOL, SERVER: SEND_RESPONSE}
+
+ # but when the server denies the request, keep_alive comes back into play
+ cs.process_event(SERVER, Response)
+ assert cs.states == {CLIENT: MUST_CLOSE, SERVER: SEND_BODY}
+
+
+def test_ConnectionState_reuse():
+ cs = ConnectionState()
+
+ with pytest.raises(LocalProtocolError):
+ cs.start_next_cycle()
+
+ cs.process_event(CLIENT, Request)
+ cs.process_event(CLIENT, EndOfMessage)
+
+ with pytest.raises(LocalProtocolError):
+ cs.start_next_cycle()
+
+ cs.process_event(SERVER, Response)
+ cs.process_event(SERVER, EndOfMessage)
+
+ cs.start_next_cycle()
+ assert cs.states == {CLIENT: IDLE, SERVER: IDLE}
+
+ # No keepalive
+
+ cs.process_event(CLIENT, Request)
+ cs.process_keep_alive_disabled()
+ cs.process_event(CLIENT, EndOfMessage)
+ cs.process_event(SERVER, Response)
+ cs.process_event(SERVER, EndOfMessage)
+
+ with pytest.raises(LocalProtocolError):
+ cs.start_next_cycle()
+
+ # One side closed
+
+ cs = ConnectionState()
+ cs.process_event(CLIENT, Request)
+ cs.process_event(CLIENT, EndOfMessage)
+ cs.process_event(CLIENT, ConnectionClosed)
+ cs.process_event(SERVER, Response)
+ cs.process_event(SERVER, EndOfMessage)
+
+ with pytest.raises(LocalProtocolError):
+ cs.start_next_cycle()
+
+    # Successful protocol switch
+
+ cs = ConnectionState()
+ cs.process_client_switch_proposal(_SWITCH_UPGRADE)
+ cs.process_event(CLIENT, Request)
+ cs.process_event(CLIENT, EndOfMessage)
+ cs.process_event(SERVER, InformationalResponse, _SWITCH_UPGRADE)
+
+ with pytest.raises(LocalProtocolError):
+ cs.start_next_cycle()
+
+ # Failed protocol switch
+
+ cs = ConnectionState()
+ cs.process_client_switch_proposal(_SWITCH_UPGRADE)
+ cs.process_event(CLIENT, Request)
+ cs.process_event(CLIENT, EndOfMessage)
+ cs.process_event(SERVER, Response)
+ cs.process_event(SERVER, EndOfMessage)
+
+ cs.start_next_cycle()
+ assert cs.states == {CLIENT: IDLE, SERVER: IDLE}
+
+
+def test_server_request_is_illegal():
+ # There used to be a bug in how we handled the Request special case that
+ # made this allowed...
+ cs = ConnectionState()
+ with pytest.raises(LocalProtocolError):
+ cs.process_event(SERVER, Request)
diff --git a/.venv/lib/python3.9/site-packages/h11/tests/test_util.py b/.venv/lib/python3.9/site-packages/h11/tests/test_util.py
new file mode 100644
index 0000000..d851bdc
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h11/tests/test_util.py
@@ -0,0 +1,99 @@
+import re
+import sys
+import traceback
+
+import pytest
+
+from .._util import *
+
+
+def test_ProtocolError():
+ with pytest.raises(TypeError):
+ ProtocolError("abstract base class")
+
+
+def test_LocalProtocolError():
+ try:
+ raise LocalProtocolError("foo")
+ except LocalProtocolError as e:
+ assert str(e) == "foo"
+ assert e.error_status_hint == 400
+
+ try:
+ raise LocalProtocolError("foo", error_status_hint=418)
+ except LocalProtocolError as e:
+ assert str(e) == "foo"
+ assert e.error_status_hint == 418
+
+ def thunk():
+ raise LocalProtocolError("a", error_status_hint=420)
+
+ try:
+ try:
+ thunk()
+ except LocalProtocolError as exc1:
+ orig_traceback = "".join(traceback.format_tb(sys.exc_info()[2]))
+ exc1._reraise_as_remote_protocol_error()
+ except RemoteProtocolError as exc2:
+ assert type(exc2) is RemoteProtocolError
+ assert exc2.args == ("a",)
+ assert exc2.error_status_hint == 420
+ new_traceback = "".join(traceback.format_tb(sys.exc_info()[2]))
+ assert new_traceback.endswith(orig_traceback)
+
+
+def test_validate():
+    my_re = re.compile(br"(?P<group1>[0-9]+)\.(?P<group2>[0-9]+)")
+ with pytest.raises(LocalProtocolError):
+ validate(my_re, b"0.")
+
+ groups = validate(my_re, b"0.1")
+ assert groups == {"group1": b"0", "group2": b"1"}
+
+ # successful partial matches are an error - must match whole string
+ with pytest.raises(LocalProtocolError):
+ validate(my_re, b"0.1xx")
+ with pytest.raises(LocalProtocolError):
+ validate(my_re, b"0.1\n")
+
+
+def test_validate_formatting():
+ my_re = re.compile(br"foo")
+
+ with pytest.raises(LocalProtocolError) as excinfo:
+ validate(my_re, b"", "oops")
+ assert "oops" in str(excinfo.value)
+
+ with pytest.raises(LocalProtocolError) as excinfo:
+ validate(my_re, b"", "oops {}")
+ assert "oops {}" in str(excinfo.value)
+
+ with pytest.raises(LocalProtocolError) as excinfo:
+ validate(my_re, b"", "oops {} xx", 10)
+ assert "oops 10 xx" in str(excinfo.value)
+
+
+def test_make_sentinel():
+ S = make_sentinel("S")
+ assert repr(S) == "S"
+ assert S == S
+ assert type(S).__name__ == "S"
+ assert S in {S}
+ assert type(S) is S
+ S2 = make_sentinel("S2")
+ assert repr(S2) == "S2"
+ assert S != S2
+ assert S not in {S2}
+ assert type(S) is not type(S2)
+
+
+def test_bytesify():
+ assert bytesify(b"123") == b"123"
+ assert bytesify(bytearray(b"123")) == b"123"
+ assert bytesify("123") == b"123"
+
+ with pytest.raises(UnicodeEncodeError):
+ bytesify("\u1234")
+
+ with pytest.raises(TypeError):
+ bytesify(10)
diff --git a/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/INSTALLER b/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/INSTALLER
new file mode 100644
index 0000000..a1b589e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/INSTALLER
@@ -0,0 +1 @@
+pip
diff --git a/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/LICENSE b/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/LICENSE
new file mode 100644
index 0000000..db41662
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/LICENSE
@@ -0,0 +1,21 @@
+The MIT License (MIT)
+
+Copyright (c) 2015-2020 Cory Benfield and contributors
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+THE SOFTWARE.
diff --git a/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/METADATA b/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/METADATA
new file mode 100644
index 0000000..61fab4c
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/METADATA
@@ -0,0 +1,102 @@
+Metadata-Version: 2.1
+Name: h2
+Version: 4.0.0
+Summary: HTTP/2 State-Machine based protocol implementation
+Home-page: https://github.com/python-hyper/hyper-h2
+Author: Cory Benfield
+Author-email: cory@lukasa.co.uk
+License: MIT License
+Platform: UNKNOWN
+Classifier: Development Status :: 5 - Production/Stable
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Programming Language :: Python
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.6
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: Implementation :: CPython
+Classifier: Programming Language :: Python :: Implementation :: PyPy
+Requires-Python: >=3.6.1
+Description-Content-Type: text/x-rst
+Requires-Dist: hyperframe (<7,>=6.0)
+Requires-Dist: hpack (<5,>=4.0)
+
+===============================
+hyper-h2: HTTP/2 Protocol Stack
+===============================
+
+.. image:: https://github.com/python-hyper/hyper-h2/workflows/CI/badge.svg
+ :target: https://github.com/python-hyper/hyper-h2/actions
+ :alt: Build Status
+.. image:: https://codecov.io/gh/python-hyper/hyper-h2/branch/master/graph/badge.svg
+ :target: https://codecov.io/gh/python-hyper/hyper-h2
+ :alt: Code Coverage
+.. image:: https://readthedocs.org/projects/hyper-h2/badge/?version=latest
+ :target: https://hyper-h2.readthedocs.io/en/latest/
+ :alt: Documentation Status
+.. image:: https://img.shields.io/badge/chat-join_now-brightgreen.svg
+ :target: https://gitter.im/python-hyper/community
+ :alt: Chat community
+
+.. image:: https://raw.github.com/Lukasa/hyper/development/docs/source/images/hyper.png
+
+This repository contains a pure-Python implementation of an HTTP/2 protocol
+stack. It's written from the ground up to be embeddable in whatever program you
+choose to use, ensuring that you can speak HTTP/2 regardless of your
+programming paradigm.
+
+You use it like this:
+
+.. code-block:: python
+
+ import h2.connection
+ import h2.config
+
+ config = h2.config.H2Configuration()
+ conn = h2.connection.H2Connection(config=config)
+ conn.send_headers(stream_id=stream_id, headers=headers)
+ conn.send_data(stream_id, data)
+ socket.sendall(conn.data_to_send())
+ events = conn.receive_data(socket_data)
+
+This repository does not provide a parsing layer, a network layer, or any rules
+about concurrency. Instead, it's a purely in-memory solution, defined in terms
+of data actions and HTTP/2 frames. This is one building block of a full Python
+HTTP implementation.
+
+To install it, just run:
+
+.. code-block:: console
+
+ $ pip install h2
+
+Documentation
+=============
+
+Documentation is available at https://hyper-h2.readthedocs.io/ .
+
+Contributing
+============
+
+``hyper-h2`` welcomes contributions from anyone! Unlike many other projects we
+are happy to accept cosmetic contributions and small contributions, in addition
+to large feature requests and changes.
+
+Before you contribute (either by opening an issue or filing a pull request),
+please `read the contribution guidelines`_.
+
+.. _read the contribution guidelines: http://python-hyper.org/en/latest/contributing.html
+
+License
+=======
+
+``hyper-h2`` is made available under the MIT License. For more details, see the
+``LICENSE`` file in the repository.
+
+Authors
+=======
+
+``hyper-h2`` is maintained by Cory Benfield, with contributions from others.
+
+
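The README above describes hyper-h2's "sans-IO" design: the connection object never touches a socket, callers feed it received bytes and drain bytes to send. A minimal sketch of that pattern in plain Python (the `TinyProtocol` name and line-based "frames" are illustrative, not part of h2's API):

```python
# A minimal sketch of the sans-IO pattern hyper-h2 follows: the protocol
# object is purely in-memory; the caller owns all network I/O and
# concurrency. Names here are illustrative, not h2's API.

class TinyProtocol:
    def __init__(self):
        self._outgoing = bytearray()

    def receive_data(self, data: bytes):
        """Consume raw bytes; return a list of parsed events (here: lines)."""
        events = []
        for line in data.split(b"\n"):
            if line:
                events.append(line.decode("ascii"))
                # Queue an acknowledgement into the outgoing buffer.
                self._outgoing += b"ACK " + line + b"\n"
        return events

    def data_to_send(self) -> bytes:
        """Drain bytes the caller should write to its own transport."""
        data = bytes(self._outgoing)
        self._outgoing.clear()
        return data


proto = TinyProtocol()
events = proto.receive_data(b"hello\nworld\n")
print(events)                # ['hello', 'world']
print(proto.data_to_send())  # b'ACK hello\nACK world\n'
```

With h2 itself, `H2Connection.receive_data()` and `data_to_send()` play the same roles: the caller reads from its socket, feeds the bytes in, handles the returned events, and writes the drained bytes back out.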
diff --git a/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/RECORD b/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/RECORD
new file mode 100644
index 0000000..f2a158b
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/RECORD
@@ -0,0 +1,28 @@
+h2-4.0.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+h2-4.0.0.dist-info/LICENSE,sha256=emWlrwy6vxwWJRx8ayt8tG0WpyIueZdbm2H81mouPyg,1102
+h2-4.0.0.dist-info/METADATA,sha256=VtPSt_aNYdJDHEQb00nMiG5ZiO1Otdt3l1nqznebyVk,3453
+h2-4.0.0.dist-info/RECORD,,
+h2-4.0.0.dist-info/WHEEL,sha256=g4nMs7d-Xl9-xC9XovUrsDHGXt-FT0E17Yqo92DEfvY,92
+h2-4.0.0.dist-info/top_level.txt,sha256=Hiulx8KxI2jFUM1dG7-CZeRkO3j50MBwCLG36Vrq-kI,3
+h2/__init__.py,sha256=B5BGVwobBOin_q7r9t6Es8oIGENwSevA-O8DQBmunTo,92
+h2/__pycache__/__init__.cpython-39.pyc,,
+h2/__pycache__/config.cpython-39.pyc,,
+h2/__pycache__/connection.cpython-39.pyc,,
+h2/__pycache__/errors.cpython-39.pyc,,
+h2/__pycache__/events.cpython-39.pyc,,
+h2/__pycache__/exceptions.cpython-39.pyc,,
+h2/__pycache__/frame_buffer.cpython-39.pyc,,
+h2/__pycache__/settings.cpython-39.pyc,,
+h2/__pycache__/stream.cpython-39.pyc,,
+h2/__pycache__/utilities.cpython-39.pyc,,
+h2/__pycache__/windows.cpython-39.pyc,,
+h2/config.py,sha256=oYy4AU3BxDo4l8JuHi8_JCDeXqhoxMOIKZ2jBNDD5eg,6600
+h2/connection.py,sha256=NZzpixAnPu6G00xDANsBENSepJ4rNORTFa-fJKh2hhs,82993
+h2/errors.py,sha256=6KzmbyYSKM9RuCd6nz7P2LtxmlSs5P2mneS_Sa8vTEo,1543
+h2/events.py,sha256=CinrMuXmvFrlPDxJZY14FV-JyBGarxRR6gZKGVEJym0,21653
+h2/exceptions.py,sha256=OlfeNH_fcGGM0rFlOpD2JGOkW6r0HxDN7nSeGzO-Hz0,5334
+h2/frame_buffer.py,sha256=7i15oPLqfAPPfTDE5A87gggW4FxB7l__vMX4Xe9R7TU,6208
+h2/settings.py,sha256=zRs_bAz9qg21jsp6HQY-TZWncIbCDwEJ_1wZOydmYNc,11690
+h2/stream.py,sha256=qaOEx0XaSaOk3_P7Beym9dIoCKb2UDBI-3ZNMMV2LoU,54539
+h2/utilities.py,sha256=CfVexJ-CU1KbArHoIpBeHJhLmWoiPyHazSf2_mDvN2Q,22699
+h2/windows.py,sha256=N_P9mTi0VZ26-eY2QsvWYi6NLx73xswvUZb7VGod1zk,5595
diff --git a/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/WHEEL b/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/WHEEL
new file mode 100644
index 0000000..b552003
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/WHEEL
@@ -0,0 +1,5 @@
+Wheel-Version: 1.0
+Generator: bdist_wheel (0.34.2)
+Root-Is-Purelib: true
+Tag: py3-none-any
+
diff --git a/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/top_level.txt b/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/top_level.txt
new file mode 100644
index 0000000..c48b563
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2-4.0.0.dist-info/top_level.txt
@@ -0,0 +1 @@
+h2
diff --git a/.venv/lib/python3.9/site-packages/h2/__init__.py b/.venv/lib/python3.9/site-packages/h2/__init__.py
new file mode 100644
index 0000000..6d9e28e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2/__init__.py
@@ -0,0 +1,8 @@
+# -*- coding: utf-8 -*-
+"""
+hyper-h2
+~~~~~~~~
+
+An HTTP/2 implementation.
+"""
+__version__ = '4.0.0'
diff --git a/.venv/lib/python3.9/site-packages/h2/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h2/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..fac8aed
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h2/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h2/__pycache__/config.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h2/__pycache__/config.cpython-39.pyc
new file mode 100644
index 0000000..a6aa3c0
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h2/__pycache__/config.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h2/__pycache__/connection.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h2/__pycache__/connection.cpython-39.pyc
new file mode 100644
index 0000000..a44dab4
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h2/__pycache__/connection.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h2/__pycache__/errors.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h2/__pycache__/errors.cpython-39.pyc
new file mode 100644
index 0000000..9e6783e
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h2/__pycache__/errors.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h2/__pycache__/events.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h2/__pycache__/events.cpython-39.pyc
new file mode 100644
index 0000000..48ac5f4
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h2/__pycache__/events.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h2/__pycache__/exceptions.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h2/__pycache__/exceptions.cpython-39.pyc
new file mode 100644
index 0000000..d8f0f66
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h2/__pycache__/exceptions.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h2/__pycache__/frame_buffer.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h2/__pycache__/frame_buffer.cpython-39.pyc
new file mode 100644
index 0000000..376d9fc
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h2/__pycache__/frame_buffer.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h2/__pycache__/settings.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h2/__pycache__/settings.cpython-39.pyc
new file mode 100644
index 0000000..a29c67d
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h2/__pycache__/settings.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h2/__pycache__/stream.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h2/__pycache__/stream.cpython-39.pyc
new file mode 100644
index 0000000..200bad6
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h2/__pycache__/stream.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h2/__pycache__/utilities.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h2/__pycache__/utilities.cpython-39.pyc
new file mode 100644
index 0000000..f75f4cf
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h2/__pycache__/utilities.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h2/__pycache__/windows.cpython-39.pyc b/.venv/lib/python3.9/site-packages/h2/__pycache__/windows.cpython-39.pyc
new file mode 100644
index 0000000..a56191a
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/h2/__pycache__/windows.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/h2/config.py b/.venv/lib/python3.9/site-packages/h2/config.py
new file mode 100644
index 0000000..730b611
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2/config.py
@@ -0,0 +1,170 @@
+# -*- coding: utf-8 -*-
+"""
+h2/config
+~~~~~~~~~
+
+Objects for controlling the configuration of the HTTP/2 stack.
+"""
+
+
+class _BooleanConfigOption:
+ """
+ Descriptor for handling a boolean config option. This will block
+ attempts to set boolean config options to non-bools.
+ """
+ def __init__(self, name):
+ self.name = name
+ self.attr_name = '_%s' % self.name
+
+ def __get__(self, instance, owner):
+ return getattr(instance, self.attr_name)
+
+ def __set__(self, instance, value):
+ if not isinstance(value, bool):
+ raise ValueError("%s must be a bool" % self.name)
+ setattr(instance, self.attr_name, value)
+
+
+class DummyLogger:
+ """
+    A Logger object that does no actual logging, hence a DummyLogger.
+
+    For this class every log operation is a no-op. The intent is to avoid
+    conditionals being sprinkled throughout the hyper-h2 code for calls to
+    logging functions when no logger is passed into the corresponding object.
+ """
+ def __init__(self, *vargs):
+ pass
+
+ def debug(self, *vargs, **kwargs):
+ """
+ No-op logging. Only level needed for now.
+ """
+ pass
+
+ def trace(self, *vargs, **kwargs):
+ """
+ No-op logging. Only level needed for now.
+ """
+ pass
+
+
+class H2Configuration:
+ """
+ An object that controls the way a single HTTP/2 connection behaves.
+
+    This object allows users to customize behaviour. In particular, it
+ allows users to enable or disable optional features, or to otherwise handle
+ various unusual behaviours.
+
+ This object has very little behaviour of its own: it mostly just ensures
+ that configuration is self-consistent.
+
+ :param client_side: Whether this object is to be used on the client side of
+ a connection, or on the server side. Affects the logic used by the
+ state machine, the default settings values, the allowable stream IDs,
+ and several other properties. Defaults to ``True``.
+ :type client_side: ``bool``
+
+ :param header_encoding: Controls whether the headers emitted by this object
+ in events are transparently decoded to ``unicode`` strings, and what
+ encoding is used to do that decoding. This defaults to ``None``,
+ meaning that headers will be returned as bytes. To automatically
+ decode headers (that is, to return them as unicode strings), this can
+ be set to the string name of any encoding, e.g. ``'utf-8'``.
+
+ .. versionchanged:: 3.0.0
+ Changed default value from ``'utf-8'`` to ``None``
+
+ :type header_encoding: ``str``, ``False``, or ``None``
+
+ :param validate_outbound_headers: Controls whether the headers emitted
+ by this object are validated against the rules in RFC 7540.
+ Disabling this setting will cause outbound header validation to
+ be skipped, and allow the object to emit headers that may be illegal
+ according to RFC 7540. Defaults to ``True``.
+ :type validate_outbound_headers: ``bool``
+
+ :param normalize_outbound_headers: Controls whether the headers emitted
+ by this object are normalized before sending. Disabling this setting
+ will cause outbound header normalization to be skipped, and allow
+ the object to emit headers that may be illegal according to
+ RFC 7540. Defaults to ``True``.
+ :type normalize_outbound_headers: ``bool``
+
+ :param validate_inbound_headers: Controls whether the headers received
+ by this object are validated against the rules in RFC 7540.
+ Disabling this setting will cause inbound header validation to
+ be skipped, and allow the object to receive headers that may be illegal
+ according to RFC 7540. Defaults to ``True``.
+ :type validate_inbound_headers: ``bool``
+
+ :param normalize_inbound_headers: Controls whether the headers received by
+ this object are normalized according to the rules of RFC 7540.
+ Disabling this setting may lead to hyper-h2 emitting header blocks that
+ some RFCs forbid, e.g. with multiple cookie fields.
+
+ .. versionadded:: 3.0.0
+
+ :type normalize_inbound_headers: ``bool``
+
+ :param logger: A logger that conforms to the requirements for this module,
+ those being no I/O and no context switches, which is needed in order
+ to run in asynchronous operation.
+
+ .. versionadded:: 2.6.0
+
+ :type logger: ``logging.Logger``
+ """
+ client_side = _BooleanConfigOption('client_side')
+ validate_outbound_headers = _BooleanConfigOption(
+ 'validate_outbound_headers'
+ )
+ normalize_outbound_headers = _BooleanConfigOption(
+ 'normalize_outbound_headers'
+ )
+ validate_inbound_headers = _BooleanConfigOption(
+ 'validate_inbound_headers'
+ )
+ normalize_inbound_headers = _BooleanConfigOption(
+ 'normalize_inbound_headers'
+ )
+
+ def __init__(self,
+ client_side=True,
+ header_encoding=None,
+ validate_outbound_headers=True,
+ normalize_outbound_headers=True,
+ validate_inbound_headers=True,
+ normalize_inbound_headers=True,
+ logger=None):
+ self.client_side = client_side
+ self.header_encoding = header_encoding
+ self.validate_outbound_headers = validate_outbound_headers
+ self.normalize_outbound_headers = normalize_outbound_headers
+ self.validate_inbound_headers = validate_inbound_headers
+ self.normalize_inbound_headers = normalize_inbound_headers
+ self.logger = logger or DummyLogger(__name__)
+
+ @property
+ def header_encoding(self):
+ """
+ Controls whether the headers emitted by this object in events are
+ transparently decoded to ``unicode`` strings, and what encoding is used
+ to do that decoding. This defaults to ``None``, meaning that headers
+ will be returned as bytes. To automatically decode headers (that is, to
+ return them as unicode strings), this can be set to the string name of
+ any encoding, e.g. ``'utf-8'``.
+ """
+ return self._header_encoding
+
+ @header_encoding.setter
+ def header_encoding(self, value):
+ """
+ Enforces constraints on the value of header encoding.
+ """
+ if not isinstance(value, (bool, str, type(None))):
+ raise ValueError("header_encoding must be bool, string, or None")
+ if value is True:
+ raise ValueError("header_encoding cannot be True")
+ self._header_encoding = value
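`_BooleanConfigOption` in config.py above is a data descriptor: assignment is intercepted by `__set__`, so non-bool values are rejected at set time. The same pattern can be reproduced standalone (the `BoolOption` and `FeatureFlags` names below are hypothetical, used only for illustration):

```python
# Standalone sketch of the data-descriptor pattern h2/config.py uses for
# _BooleanConfigOption: the descriptor stores the value under a private
# attribute and rejects any non-bool assignment.

class BoolOption:
    def __init__(self, name):
        self.name = name
        self.attr_name = '_' + name

    def __get__(self, instance, owner):
        return getattr(instance, self.attr_name)

    def __set__(self, instance, value):
        if not isinstance(value, bool):
            raise ValueError("%s must be a bool" % self.name)
        setattr(instance, self.attr_name, value)


class FeatureFlags:
    strict = BoolOption('strict')  # class-level descriptor

    def __init__(self, strict=True):
        self.strict = strict  # routed through BoolOption.__set__


flags = FeatureFlags()
print(flags.strict)  # True
try:
    flags.strict = "yes"  # not a bool: rejected
except ValueError as exc:
    print(exc)       # strict must be a bool
```

Defining the option once at class level keeps the validation logic in one place, which is why H2Configuration declares each of its boolean options this way rather than validating in `__init__`.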
diff --git a/.venv/lib/python3.9/site-packages/h2/connection.py b/.venv/lib/python3.9/site-packages/h2/connection.py
new file mode 100644
index 0000000..aa30711
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2/connection.py
@@ -0,0 +1,2047 @@
+# -*- coding: utf-8 -*-
+"""
+h2/connection
+~~~~~~~~~~~~~
+
+An implementation of an HTTP/2 connection.
+"""
+import base64
+
+from enum import Enum, IntEnum
+
+from hyperframe.exceptions import InvalidPaddingError
+from hyperframe.frame import (
+ GoAwayFrame, WindowUpdateFrame, HeadersFrame, DataFrame, PingFrame,
+ PushPromiseFrame, SettingsFrame, RstStreamFrame, PriorityFrame,
+ ContinuationFrame, AltSvcFrame, ExtensionFrame
+)
+from hpack.hpack import Encoder, Decoder
+from hpack.exceptions import HPACKError, OversizedHeaderListError
+
+from .config import H2Configuration
+from .errors import ErrorCodes, _error_code_from_int
+from .events import (
+ WindowUpdated, RemoteSettingsChanged, PingReceived, PingAckReceived,
+ SettingsAcknowledged, ConnectionTerminated, PriorityUpdated,
+ AlternativeServiceAvailable, UnknownFrameReceived
+)
+from .exceptions import (
+ ProtocolError, NoSuchStreamError, FlowControlError, FrameTooLargeError,
+ TooManyStreamsError, StreamClosedError, StreamIDTooLowError,
+ NoAvailableStreamIDError, RFC1122Error, DenialOfServiceError
+)
+from .frame_buffer import FrameBuffer
+from .settings import Settings, SettingCodes
+from .stream import H2Stream, StreamClosedBy
+from .utilities import SizeLimitDict, guard_increment_window
+from .windows import WindowManager
+
+
+class ConnectionState(Enum):
+ IDLE = 0
+ CLIENT_OPEN = 1
+ SERVER_OPEN = 2
+ CLOSED = 3
+
+
+class ConnectionInputs(Enum):
+ SEND_HEADERS = 0
+ SEND_PUSH_PROMISE = 1
+ SEND_DATA = 2
+ SEND_GOAWAY = 3
+ SEND_WINDOW_UPDATE = 4
+ SEND_PING = 5
+ SEND_SETTINGS = 6
+ SEND_RST_STREAM = 7
+ SEND_PRIORITY = 8
+ RECV_HEADERS = 9
+ RECV_PUSH_PROMISE = 10
+ RECV_DATA = 11
+ RECV_GOAWAY = 12
+ RECV_WINDOW_UPDATE = 13
+ RECV_PING = 14
+ RECV_SETTINGS = 15
+ RECV_RST_STREAM = 16
+ RECV_PRIORITY = 17
+ SEND_ALTERNATIVE_SERVICE = 18 # Added in 2.3.0
+ RECV_ALTERNATIVE_SERVICE = 19 # Added in 2.3.0
+
+
+class AllowedStreamIDs(IntEnum):
+ EVEN = 0
+ ODD = 1
+
+
+class H2ConnectionStateMachine:
+ """
+ A single HTTP/2 connection state machine.
+
+ This state machine, while defined in its own class, is logically part of
+ the H2Connection class also defined in this file. The state machine itself
+ maintains very little state directly, instead focusing entirely on managing
+ state transitions.
+ """
+ # For the purposes of this state machine we treat HEADERS and their
+ # associated CONTINUATION frames as a single jumbo frame. The protocol
+    # allows/requires this by preventing other frames from being interleaved in
+ # between HEADERS/CONTINUATION frames.
+ #
+ # The _transitions dictionary contains a mapping of tuples of
+ # (state, input) to tuples of (side_effect_function, end_state). This map
+ # contains all allowed transitions: anything not in this map is invalid
+ # and immediately causes a transition to ``closed``.
+
+ _transitions = {
+ # State: idle
+ (ConnectionState.IDLE, ConnectionInputs.SEND_HEADERS):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.IDLE, ConnectionInputs.RECV_HEADERS):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.IDLE, ConnectionInputs.SEND_SETTINGS):
+ (None, ConnectionState.IDLE),
+ (ConnectionState.IDLE, ConnectionInputs.RECV_SETTINGS):
+ (None, ConnectionState.IDLE),
+ (ConnectionState.IDLE, ConnectionInputs.SEND_WINDOW_UPDATE):
+ (None, ConnectionState.IDLE),
+ (ConnectionState.IDLE, ConnectionInputs.RECV_WINDOW_UPDATE):
+ (None, ConnectionState.IDLE),
+ (ConnectionState.IDLE, ConnectionInputs.SEND_PING):
+ (None, ConnectionState.IDLE),
+ (ConnectionState.IDLE, ConnectionInputs.RECV_PING):
+ (None, ConnectionState.IDLE),
+ (ConnectionState.IDLE, ConnectionInputs.SEND_GOAWAY):
+ (None, ConnectionState.CLOSED),
+ (ConnectionState.IDLE, ConnectionInputs.RECV_GOAWAY):
+ (None, ConnectionState.CLOSED),
+ (ConnectionState.IDLE, ConnectionInputs.SEND_PRIORITY):
+ (None, ConnectionState.IDLE),
+ (ConnectionState.IDLE, ConnectionInputs.RECV_PRIORITY):
+ (None, ConnectionState.IDLE),
+ (ConnectionState.IDLE, ConnectionInputs.SEND_ALTERNATIVE_SERVICE):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.IDLE, ConnectionInputs.RECV_ALTERNATIVE_SERVICE):
+ (None, ConnectionState.CLIENT_OPEN),
+
+ # State: open, client side.
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.SEND_HEADERS):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.SEND_DATA):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.SEND_GOAWAY):
+ (None, ConnectionState.CLOSED),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.SEND_WINDOW_UPDATE):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.SEND_PING):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.SEND_SETTINGS):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.SEND_PRIORITY):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.RECV_HEADERS):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.RECV_PUSH_PROMISE):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.RECV_DATA):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.RECV_GOAWAY):
+ (None, ConnectionState.CLOSED),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.RECV_WINDOW_UPDATE):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.RECV_PING):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.RECV_SETTINGS):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.SEND_RST_STREAM):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.RECV_RST_STREAM):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN, ConnectionInputs.RECV_PRIORITY):
+ (None, ConnectionState.CLIENT_OPEN),
+ (ConnectionState.CLIENT_OPEN,
+ ConnectionInputs.RECV_ALTERNATIVE_SERVICE):
+ (None, ConnectionState.CLIENT_OPEN),
+
+ # State: open, server side.
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.SEND_HEADERS):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.SEND_PUSH_PROMISE):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.SEND_DATA):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.SEND_GOAWAY):
+ (None, ConnectionState.CLOSED),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.SEND_WINDOW_UPDATE):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.SEND_PING):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.SEND_SETTINGS):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.SEND_PRIORITY):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.RECV_HEADERS):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.RECV_DATA):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.RECV_GOAWAY):
+ (None, ConnectionState.CLOSED),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.RECV_WINDOW_UPDATE):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.RECV_PING):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.RECV_SETTINGS):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.RECV_PRIORITY):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.SEND_RST_STREAM):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN, ConnectionInputs.RECV_RST_STREAM):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN,
+ ConnectionInputs.SEND_ALTERNATIVE_SERVICE):
+ (None, ConnectionState.SERVER_OPEN),
+ (ConnectionState.SERVER_OPEN,
+ ConnectionInputs.RECV_ALTERNATIVE_SERVICE):
+ (None, ConnectionState.SERVER_OPEN),
+
+ # State: closed
+ (ConnectionState.CLOSED, ConnectionInputs.SEND_GOAWAY):
+ (None, ConnectionState.CLOSED),
+ (ConnectionState.CLOSED, ConnectionInputs.RECV_GOAWAY):
+ (None, ConnectionState.CLOSED),
+ }
+
+ def __init__(self):
+ self.state = ConnectionState.IDLE
+
+ def process_input(self, input_):
+ """
+ Process a specific input in the state machine.
+ """
+ if not isinstance(input_, ConnectionInputs):
+ raise ValueError("Input must be an instance of ConnectionInputs")
+
+ try:
+ func, target_state = self._transitions[(self.state, input_)]
+ except KeyError:
+ old_state = self.state
+ self.state = ConnectionState.CLOSED
+ raise ProtocolError(
+ "Invalid input %s in state %s" % (input_, old_state)
+ )
+ else:
+ self.state = target_state
+ if func is not None: # pragma: no cover
+ return func()
+
+ return []
+
+
+class H2Connection:
+ """
+ A low-level HTTP/2 connection object. This handles building and receiving
+ frames and maintains both connection and per-stream state for all streams
+ on this connection.
+
+    This wraps an HTTP/2 Connection state machine implementation, ensuring that
+ frames can only be sent/received when the connection is in a valid state.
+ It also builds stream state machines on demand to ensure that the
+ constraints of those state machines are met as well. Attempts to create
+ frames that cannot be sent will raise a ``ProtocolError``.
+
+ .. versionchanged:: 2.3.0
+ Added the ``header_encoding`` keyword argument.
+
+ .. versionchanged:: 2.5.0
+ Added the ``config`` keyword argument. Deprecated the ``client_side``
+ and ``header_encoding`` parameters.
+
+ .. versionchanged:: 3.0.0
+ Removed deprecated parameters and properties.
+
+ :param config: The configuration for the HTTP/2 connection.
+
+ .. versionadded:: 2.5.0
+
+    :type config: :class:`H2Configuration <h2.config.H2Configuration>`
+ """
+ # The initial maximum outbound frame size. This can be changed by receiving
+ # a settings frame.
+ DEFAULT_MAX_OUTBOUND_FRAME_SIZE = 65535
+
+ # The initial maximum inbound frame size. This is somewhat arbitrarily
+ # chosen.
+ DEFAULT_MAX_INBOUND_FRAME_SIZE = 2**24
+
+ # The highest acceptable stream ID.
+ HIGHEST_ALLOWED_STREAM_ID = 2**31 - 1
+
+ # The largest acceptable window increment.
+ MAX_WINDOW_INCREMENT = 2**31 - 1
+
+ # The initial default value of SETTINGS_MAX_HEADER_LIST_SIZE.
+ DEFAULT_MAX_HEADER_LIST_SIZE = 2**16
+
+ # Keep in memory limited amount of results for streams closes
+ MAX_CLOSED_STREAMS = 2**16
+
+ def __init__(self, config=None):
+ self.state_machine = H2ConnectionStateMachine()
+ self.streams = {}
+ self.highest_inbound_stream_id = 0
+ self.highest_outbound_stream_id = 0
+ self.encoder = Encoder()
+ self.decoder = Decoder()
+
+ # This won't always actually do anything: for versions of HPACK older
+ # than 2.3.0 it does nothing. However, we have to try!
+ self.decoder.max_header_list_size = self.DEFAULT_MAX_HEADER_LIST_SIZE
+
+ #: The configuration for this HTTP/2 connection object.
+ #:
+ #: .. versionadded:: 2.5.0
+ self.config = config
+ if self.config is None:
+ self.config = H2Configuration(
+ client_side=True,
+ )
+
+ # Objects that store settings, including defaults.
+ #
+ # We set the MAX_CONCURRENT_STREAMS value to 100 because its default is
+ # unbounded, and that's a dangerous default because it allows
+ # essentially unbounded resources to be allocated regardless of how
+ # they will be used. 100 should be suitable for the average
+ # application. This default obviously does not apply to the remote
+ # peer's settings: the remote peer controls them!
+ #
+ # We also set MAX_HEADER_LIST_SIZE to a reasonable value. This is to
+ # advertise our defence against CVE-2016-6581. However, not all
+ # versions of HPACK will let us do it. That's ok: we should at least
+ # suggest that we're not vulnerable.
+ self.local_settings = Settings(
+ client=self.config.client_side,
+ initial_values={
+ SettingCodes.MAX_CONCURRENT_STREAMS: 100,
+ SettingCodes.MAX_HEADER_LIST_SIZE:
+ self.DEFAULT_MAX_HEADER_LIST_SIZE,
+ }
+ )
+ self.remote_settings = Settings(client=not self.config.client_side)
+
+ # The current value of the connection flow control windows on the
+ # connection.
+ self.outbound_flow_control_window = (
+ self.remote_settings.initial_window_size
+ )
+
+ #: The maximum size of a frame that can be emitted by this peer, in
+ #: bytes.
+ self.max_outbound_frame_size = self.remote_settings.max_frame_size
+
+ #: The maximum size of a frame that can be received by this peer, in
+ #: bytes.
+ self.max_inbound_frame_size = self.local_settings.max_frame_size
+
+ # Buffer for incoming data.
+ self.incoming_buffer = FrameBuffer(server=not self.config.client_side)
+
+ # A private variable to store a sequence of received header frames
+ # until completion.
+ self._header_frames = []
+
+ # Data that needs to be sent.
+ self._data_to_send = bytearray()
+
+ # Keeps track of how streams are closed.
+ # Used to ensure that we don't blow up in the face of frames that were
+ # in flight when a RST_STREAM was sent.
+ # Also used to determine whether we should consider a frame received
+ # while a stream is closed as either a stream error or a connection
+ # error.
+ self._closed_streams = SizeLimitDict(
+ size_limit=self.MAX_CLOSED_STREAMS
+ )
+
+ # The flow control window manager for the connection.
+ self._inbound_flow_control_window_manager = WindowManager(
+ max_window_size=self.local_settings.initial_window_size
+ )
+
+ # When in doubt use dict-dispatch.
+ self._frame_dispatch_table = {
+ HeadersFrame: self._receive_headers_frame,
+ PushPromiseFrame: self._receive_push_promise_frame,
+ SettingsFrame: self._receive_settings_frame,
+ DataFrame: self._receive_data_frame,
+ WindowUpdateFrame: self._receive_window_update_frame,
+ PingFrame: self._receive_ping_frame,
+ RstStreamFrame: self._receive_rst_stream_frame,
+ PriorityFrame: self._receive_priority_frame,
+ GoAwayFrame: self._receive_goaway_frame,
+ ContinuationFrame: self._receive_naked_continuation,
+ AltSvcFrame: self._receive_alt_svc_frame,
+ ExtensionFrame: self._receive_unknown_frame
+ }
+
+ def _prepare_for_sending(self, frames):
+ if not frames:
+ return
+ self._data_to_send += b''.join(f.serialize() for f in frames)
+ assert all(f.body_len <= self.max_outbound_frame_size for f in frames)
+
+ def _open_streams(self, remainder):
+ """
+ A common method of counting number of open streams. Returns the number
+ of streams that are open *and* that have (stream ID % 2) == remainder.
+ While it iterates, also deletes any closed streams.
+ """
+ count = 0
+ to_delete = []
+
+ for stream_id, stream in self.streams.items():
+ if stream.open and (stream_id % 2 == remainder):
+ count += 1
+ elif stream.closed:
+ to_delete.append(stream_id)
+
+ for stream_id in to_delete:
+ stream = self.streams.pop(stream_id)
+ self._closed_streams[stream_id] = stream.closed_by
+
+ return count
+
+ @property
+ def open_outbound_streams(self):
+ """
+ The current number of open outbound streams.
+ """
+ outbound_numbers = int(self.config.client_side)
+ return self._open_streams(outbound_numbers)
+
+ @property
+ def open_inbound_streams(self):
+ """
+ The current number of open inbound streams.
+ """
+ inbound_numbers = int(not self.config.client_side)
+ return self._open_streams(inbound_numbers)
+
+ @property
+ def inbound_flow_control_window(self):
+ """
+ The size of the inbound flow control window for the connection. This is
+    rarely publicly useful: instead, use :meth:`remote_flow_control_window
+    <h2.connection.H2Connection.remote_flow_control_window>`. This
+    property is largely present as a shortcut to this data.
+ """
+ return self._inbound_flow_control_window_manager.current_window_size
+
+ def _begin_new_stream(self, stream_id, allowed_ids):
+ """
+ Initiate a new stream.
+
+ .. versionchanged:: 2.0.0
+ Removed this function from the public API.
+
+ :param stream_id: The ID of the stream to open.
+ :param allowed_ids: What kind of stream ID is allowed.
+ """
+ self.config.logger.debug(
+ "Attempting to initiate stream ID %d", stream_id
+ )
+ outbound = self._stream_id_is_outbound(stream_id)
+ highest_stream_id = (
+ self.highest_outbound_stream_id if outbound else
+ self.highest_inbound_stream_id
+ )
+
+ if stream_id <= highest_stream_id:
+ raise StreamIDTooLowError(stream_id, highest_stream_id)
+
+ if (stream_id % 2) != int(allowed_ids):
+ raise ProtocolError(
+ "Invalid stream ID for peer."
+ )
+
+ s = H2Stream(
+ stream_id,
+ config=self.config,
+ inbound_window_size=self.local_settings.initial_window_size,
+ outbound_window_size=self.remote_settings.initial_window_size
+ )
+ self.config.logger.debug("Stream ID %d created", stream_id)
+ s.max_inbound_frame_size = self.max_inbound_frame_size
+ s.max_outbound_frame_size = self.max_outbound_frame_size
+
+ self.streams[stream_id] = s
+ self.config.logger.debug("Current streams: %s", self.streams.keys())
+
+ if outbound:
+ self.highest_outbound_stream_id = stream_id
+ else:
+ self.highest_inbound_stream_id = stream_id
+
+ return s
+
+ def initiate_connection(self):
+ """
+ Provides any data that needs to be sent at the start of the connection.
+ Must be called for both clients and servers.
+ """
+ self.config.logger.debug("Initializing connection")
+ self.state_machine.process_input(ConnectionInputs.SEND_SETTINGS)
+ if self.config.client_side:
+ preamble = b'PRI * HTTP/2.0\r\n\r\nSM\r\n\r\n'
+ else:
+ preamble = b''
+
+ f = SettingsFrame(0)
+ for setting, value in self.local_settings.items():
+ f.settings[setting] = value
+ self.config.logger.debug(
+ "Send Settings frame: %s", self.local_settings
+ )
+
+ self._data_to_send += preamble + f.serialize()
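The client preface queued above is fixed by RFC 7540 Section 3.5: 24 octets spelling out an intentionally invalid HTTP/1.1 request, immediately followed by a SETTINGS frame. A minimal standalone sketch of what `initiate_connection` produces for a client — the 9-byte frame header is hand-rolled here purely for illustration; the real code delegates frame serialization to a frame library:

```python
# Sketch: the fixed HTTP/2 client preface plus an empty SETTINGS frame,
# mirroring the bytes initiate_connection() queues on the client side.
PREFACE = b'PRI * HTTP/2.0\r\n\r\nSM\r\n\r\n'  # always 24 octets (RFC 7540 s3.5)


def empty_settings_frame():
    # Frame header: 24-bit payload length, 8-bit type (0x4 = SETTINGS),
    # 8-bit flags, then a 32-bit stream ID (0 for connection-level frames).
    length = (0).to_bytes(3, 'big')
    frame_type = b'\x04'
    flags = b'\x00'
    stream_id = (0).to_bytes(4, 'big')
    return length + frame_type + flags + stream_id


first_bytes = PREFACE + empty_settings_frame()
```

Servers skip the preface (it is client-only) but still send the SETTINGS frame, which is why the code above uses an empty `preamble` on the server side.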
+
+ def initiate_upgrade_connection(self, settings_header=None):
+ """
+ Call to initialise the connection object for use with an upgraded
+ HTTP/2 connection (i.e. a connection negotiated using the
+ ``Upgrade: h2c`` HTTP header).
+
+        This method differs from :meth:`initiate_connection` in several ways.
+ Firstly, it handles the additional SETTINGS frame that is sent in the
+ ``HTTP2-Settings`` header field. When called on a client connection,
+ this method will return a bytestring that the caller can put in the
+ ``HTTP2-Settings`` field they send on their initial request. When
+ called on a server connection, the user **must** provide the value they
+ received from the client in the ``HTTP2-Settings`` header field to the
+ ``settings_header`` argument, which will be used appropriately.
+
+ Additionally, this method sets up stream 1 in a half-closed state
+ appropriate for this side of the connection, to reflect the fact that
+ the request is already complete.
+
+ Finally, this method also prepares the appropriate preamble to be sent
+ after the upgrade.
+
+ .. versionadded:: 2.3.0
+
+        :param settings_header: (optional, server-only) The value of the
+ ``HTTP2-Settings`` header field received from the client.
+ :type settings_header: ``bytes``
+
+ :returns: For clients, a bytestring to put in the ``HTTP2-Settings``.
+ For servers, returns nothing.
+ :rtype: ``bytes`` or ``None``
+ """
+ self.config.logger.debug(
+ "Upgrade connection. Current settings: %s", self.local_settings
+ )
+
+ frame_data = None
+ # Begin by getting the preamble in place.
+ self.initiate_connection()
+
+ if self.config.client_side:
+ f = SettingsFrame(0)
+ for setting, value in self.local_settings.items():
+ f.settings[setting] = value
+
+ frame_data = f.serialize_body()
+ frame_data = base64.urlsafe_b64encode(frame_data)
+ elif settings_header:
+ # We have a settings header from the client. This needs to be
+ # applied, but we want to throw away the ACK. We do this by
+ # inserting the data into a Settings frame and then passing it to
+ # the state machine, but ignoring the return value.
+ settings_header = base64.urlsafe_b64decode(settings_header)
+ f = SettingsFrame(0)
+ f.parse_body(settings_header)
+ self._receive_settings_frame(f)
+
+ # Set up appropriate state. Stream 1 in a half-closed state:
+ # half-closed(local) for clients, half-closed(remote) for servers.
+ # Additionally, we need to set up the Connection state machine.
+ connection_input = (
+ ConnectionInputs.SEND_HEADERS if self.config.client_side
+ else ConnectionInputs.RECV_HEADERS
+ )
+ self.config.logger.debug("Process input %s", connection_input)
+ self.state_machine.process_input(connection_input)
+
+ # Set up stream 1.
+ self._begin_new_stream(stream_id=1, allowed_ids=AllowedStreamIDs.ODD)
+ self.streams[1].upgrade(self.config.client_side)
+ return frame_data
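The `HTTP2-Settings` value produced and consumed above is just the SETTINGS frame payload, base64url-encoded: each setting is six octets (a 16-bit identifier plus a 32-bit value, RFC 7540 Section 6.5.1). A self-contained sketch of that encoding; the setting identifiers used are the RFC 7540 registry values, not h2 API:

```python
import base64
import struct


def http2_settings_header(settings):
    # Each entry is a 16-bit setting identifier followed by a 32-bit value
    # (RFC 7540 Section 6.5.1); the header field carries the concatenated
    # payload base64url-encoded (RFC 7540 Section 3.2.1).
    payload = b''.join(
        struct.pack('>HL', identifier, value)
        for identifier, value in settings.items()
    )
    return base64.urlsafe_b64encode(payload)


# 0x3 = SETTINGS_MAX_CONCURRENT_STREAMS, 0x4 = SETTINGS_INITIAL_WINDOW_SIZE
header = http2_settings_header({0x3: 100, 0x4: 65535})
```

A server reverses the process exactly as the `elif settings_header:` branch does: base64url-decode, parse the payload as a SETTINGS frame body, and apply it.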
+
+ def _get_or_create_stream(self, stream_id, allowed_ids):
+ """
+ Gets a stream by its stream ID. Will create one if one does not already
+ exist. Use allowed_ids to circumvent the usual stream ID rules for
+ clients and servers.
+
+ .. versionchanged:: 2.0.0
+ Removed this function from the public API.
+ """
+ try:
+ return self.streams[stream_id]
+ except KeyError:
+ return self._begin_new_stream(stream_id, allowed_ids)
+
+ def _get_stream_by_id(self, stream_id):
+ """
+        Gets a stream by its stream ID. Raises NoSuchStreamError if the stream
+        ID does not correspond to a known stream and is higher than the
+        current maximum; raises StreamClosedError if it is lower than the
+        current maximum.
+
+ .. versionchanged:: 2.0.0
+ Removed this function from the public API.
+ """
+ try:
+ return self.streams[stream_id]
+ except KeyError:
+ outbound = self._stream_id_is_outbound(stream_id)
+ highest_stream_id = (
+ self.highest_outbound_stream_id if outbound else
+ self.highest_inbound_stream_id
+ )
+
+ if stream_id > highest_stream_id:
+ raise NoSuchStreamError(stream_id)
+ else:
+ raise StreamClosedError(stream_id)
+
+ def get_next_available_stream_id(self):
+ """
+ Returns an integer suitable for use as the stream ID for the next
+ stream created by this endpoint. For server endpoints, this stream ID
+ will be even. For client endpoints, this stream ID will be odd. If no
+        stream IDs are available, raises :class:`NoAvailableStreamIDError`.
+
+ .. warning:: The return value from this function does not change until
+ the stream ID has actually been used by sending or pushing
+ headers on that stream. For that reason, it should be
+ called as close as possible to the actual use of the
+ stream ID.
+
+ .. versionadded:: 2.0.0
+
+        :raises: :class:`NoAvailableStreamIDError`
+ :returns: The next free stream ID this peer can use to initiate a
+ stream.
+ :rtype: ``int``
+ """
+ # No streams have been opened yet, so return the lowest allowed stream
+ # ID.
+ if not self.highest_outbound_stream_id:
+ next_stream_id = 1 if self.config.client_side else 2
+ else:
+ next_stream_id = self.highest_outbound_stream_id + 2
+ self.config.logger.debug(
+ "Next available stream ID %d", next_stream_id
+ )
+ if next_stream_id > self.HIGHEST_ALLOWED_STREAM_ID:
+ raise NoAvailableStreamIDError("Exhausted allowed stream IDs")
+
+ return next_stream_id
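The parity rule applied above comes from RFC 7540 Section 5.1.1: client-initiated streams use odd IDs, server-initiated streams use even IDs, IDs grow monotonically, and the ID space is 31 bits. The same logic as a standalone sketch (names here are illustrative, not h2 API):

```python
MAX_STREAM_ID = 2**31 - 1  # stream IDs are 31-bit unsigned integers


def next_stream_id(highest_outbound, client_side):
    # First stream: 1 for clients (odd), 2 for servers (even);
    # afterwards, step by 2 so parity is preserved.
    if not highest_outbound:
        candidate = 1 if client_side else 2
    else:
        candidate = highest_outbound + 2
    if candidate > MAX_STREAM_ID:
        raise ValueError("Exhausted allowed stream IDs")
    return candidate
```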
+
+ def send_headers(self, stream_id, headers, end_stream=False,
+ priority_weight=None, priority_depends_on=None,
+ priority_exclusive=None):
+ """
+ Send headers on a given stream.
+
+ This function can be used to send request or response headers: the kind
+ that are sent depends on whether this connection has been opened as a
+ client or server connection, and whether the stream was opened by the
+ remote peer or not.
+
+ If this is a client connection, calling ``send_headers`` will send the
+ headers as a request. It will also implicitly open the stream being
+ used. If this is a client connection and ``send_headers`` has *already*
+ been called, this will send trailers instead.
+
+ If this is a server connection, calling ``send_headers`` will send the
+ headers as a response. It is a protocol error for a server to open a
+ stream by sending headers. If this is a server connection and
+ ``send_headers`` has *already* been called, this will send trailers
+ instead.
+
+ When acting as a server, you may call ``send_headers`` any number of
+ times allowed by the following rules, in this order:
+
+ - zero or more times with ``(':status', '1XX')`` (where ``1XX`` is a
+ placeholder for any 100-level status code).
+ - once with any other status header.
+ - zero or one time for trailers.
+
+ That is, you are allowed to send as many informational responses as you
+ like, followed by one complete response and zero or one HTTP trailer
+ blocks.
+
+ Clients may send one or two header blocks: one request block, and
+ optionally one trailer block.
+
+        If it is important to send HPACK "never indexed" header fields (as
+        defined in RFC 7541 Section 7.1.3), the user may instead provide
+        headers using the HPACK library's :class:`HeaderTuple` and
+        :class:`NeverIndexedHeaderTuple` objects.
+
+ This method also allows users to prioritize the stream immediately,
+ by sending priority information on the HEADERS frame directly. To do
+ this, any one of ``priority_weight``, ``priority_depends_on``, or
+ ``priority_exclusive`` must be set to a value that is not ``None``. For
+        more information on the priority fields, see :meth:`prioritize`.
+
+ .. warning:: In HTTP/2, it is mandatory that all the HTTP/2 special
+ headers (that is, ones whose header keys begin with ``:``) appear
+ at the start of the header block, before any normal headers.
+
+ .. versionchanged:: 2.3.0
+            Added support for using :class:`HeaderTuple` objects to store
+            headers.
+
+ .. versionchanged:: 2.4.0
+ Added the ability to provide priority keyword arguments:
+ ``priority_weight``, ``priority_depends_on``, and
+ ``priority_exclusive``.
+
+ :param stream_id: The stream ID to send the headers on. If this stream
+ does not currently exist, it will be created.
+ :type stream_id: ``int``
+
+ :param headers: The request/response headers to send.
+        :type headers: An iterable of two-tuples of bytestrings or
+            :class:`HeaderTuple` objects.
+
+ :param end_stream: Whether this headers frame should end the stream
+ immediately (that is, whether no more data will be sent after this
+ frame). Defaults to ``False``.
+ :type end_stream: ``bool``
+
+ :param priority_weight: Sets the priority weight of the stream. See
+            :meth:`prioritize` for more
+ about how this field works. Defaults to ``None``, which means that
+ no priority information will be sent.
+ :type priority_weight: ``int`` or ``None``
+
+ :param priority_depends_on: Sets which stream this one depends on for
+            priority purposes. See :meth:`prioritize` for more about how this
+ field works. Defaults to ``None``, which means that no priority
+ information will be sent.
+ :type priority_depends_on: ``int`` or ``None``
+
+ :param priority_exclusive: Sets whether this stream exclusively depends
+ on the stream given in ``priority_depends_on`` for priority
+            purposes. See :meth:`prioritize` for more about how this
+            field works. Defaults to ``None``, which means that no priority
+            information will be sent.
+        :type priority_exclusive: ``bool`` or ``None``
+
+ :returns: Nothing
+ """
+ self.config.logger.debug(
+ "Send headers on stream ID %d", stream_id
+ )
+
+ # Check we can open the stream.
+ if stream_id not in self.streams:
+ max_open_streams = self.remote_settings.max_concurrent_streams
+ if (self.open_outbound_streams + 1) > max_open_streams:
+ raise TooManyStreamsError(
+ "Max outbound streams is %d, %d open" %
+ (max_open_streams, self.open_outbound_streams)
+ )
+
+ self.state_machine.process_input(ConnectionInputs.SEND_HEADERS)
+ stream = self._get_or_create_stream(
+ stream_id, AllowedStreamIDs(self.config.client_side)
+ )
+ frames = stream.send_headers(
+ headers, self.encoder, end_stream
+ )
+
+ # We may need to send priority information.
+ priority_present = (
+ (priority_weight is not None) or
+ (priority_depends_on is not None) or
+ (priority_exclusive is not None)
+ )
+
+ if priority_present:
+ if not self.config.client_side:
+ raise RFC1122Error("Servers SHOULD NOT prioritize streams.")
+
+ headers_frame = frames[0]
+ headers_frame.flags.add('PRIORITY')
+ frames[0] = _add_frame_priority(
+ headers_frame,
+ priority_weight,
+ priority_depends_on,
+ priority_exclusive
+ )
+
+ self._prepare_for_sending(frames)
+
+ def send_data(self, stream_id, data, end_stream=False, pad_length=None):
+ """
+ Send data on a given stream.
+
+ This method does no breaking up of data: if the data is larger than the
+        value returned by :meth:`local_flow_control_window` for this stream
+        then a :class:`FlowControlError` will be raised. If the data is
+        larger than :data:`max_outbound_frame_size` then a
+        :class:`FrameTooLargeError` will be raised.
+
+ Hyper-h2 does this to avoid buffering the data internally. If the user
+ has more data to send than hyper-h2 will allow, consider breaking it up
+ and buffering it externally.
+
+ :param stream_id: The ID of the stream on which to send the data.
+ :type stream_id: ``int``
+ :param data: The data to send on the stream.
+ :type data: ``bytes``
+ :param end_stream: (optional) Whether this is the last data to be sent
+ on the stream. Defaults to ``False``.
+ :type end_stream: ``bool``
+ :param pad_length: (optional) Length of the padding to apply to the
+ data frame. Defaults to ``None`` for no use of padding. Note that
+ a value of ``0`` results in padding of length ``0``
+ (with the "padding" flag set on the frame).
+
+ .. versionadded:: 2.6.0
+
+ :type pad_length: ``int``
+ :returns: Nothing
+ """
+ self.config.logger.debug(
+ "Send data on stream ID %d with len %d", stream_id, len(data)
+ )
+ frame_size = len(data)
+ if pad_length is not None:
+ if not isinstance(pad_length, int):
+ raise TypeError("pad_length must be an int")
+ if pad_length < 0 or pad_length > 255:
+ raise ValueError("pad_length must be within range: [0, 255]")
+ # Account for padding bytes plus the 1-byte padding length field.
+ frame_size += pad_length + 1
+ self.config.logger.debug(
+ "Frame size on stream ID %d is %d", stream_id, frame_size
+ )
+
+ if frame_size > self.local_flow_control_window(stream_id):
+ raise FlowControlError(
+ "Cannot send %d bytes, flow control window is %d." %
+ (frame_size, self.local_flow_control_window(stream_id))
+ )
+ elif frame_size > self.max_outbound_frame_size:
+ raise FrameTooLargeError(
+ "Cannot send frame size %d, max frame size is %d" %
+ (frame_size, self.max_outbound_frame_size)
+ )
+
+ self.state_machine.process_input(ConnectionInputs.SEND_DATA)
+ frames = self.streams[stream_id].send_data(
+ data, end_stream, pad_length=pad_length
+ )
+
+ self._prepare_for_sending(frames)
+
+ self.outbound_flow_control_window -= frame_size
+ self.config.logger.debug(
+ "Outbound flow control window size is %d",
+ self.outbound_flow_control_window
+ )
+ assert self.outbound_flow_control_window >= 0
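The flow-controlled size computed above counts more than the data itself: per RFC 7540 Section 6.9.1, padding octets and the one-octet Pad Length field also consume flow control window. That accounting, isolated as a sketch:

```python
def flow_controlled_size(data, pad_length=None):
    # A padded DATA frame consumes len(data) bytes of data, pad_length
    # bytes of padding, plus one byte for the Pad Length field itself.
    size = len(data)
    if pad_length is not None:
        if not 0 <= pad_length <= 255:
            raise ValueError("pad_length must be within range: [0, 255]")
        size += pad_length + 1
    return size
```

This is why `pad_length=0` is not the same as `pad_length=None`: a zero-length pad still sets the padding flag and still costs one byte of window.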
+
+ def end_stream(self, stream_id):
+ """
+ Cleanly end a given stream.
+
+ This method ends a stream by sending an empty DATA frame on that stream
+ with the ``END_STREAM`` flag set.
+
+ :param stream_id: The ID of the stream to end.
+ :type stream_id: ``int``
+ :returns: Nothing
+ """
+ self.config.logger.debug("End stream ID %d", stream_id)
+ self.state_machine.process_input(ConnectionInputs.SEND_DATA)
+ frames = self.streams[stream_id].end_stream()
+ self._prepare_for_sending(frames)
+
+ def increment_flow_control_window(self, increment, stream_id=None):
+ """
+ Increment a flow control window, optionally for a single stream. Allows
+ the remote peer to send more data.
+
+ .. versionchanged:: 2.0.0
+ Rejects attempts to increment the flow control window by out of
+ range values with a ``ValueError``.
+
+ :param increment: The amount to increment the flow control window by.
+ :type increment: ``int``
+ :param stream_id: (optional) The ID of the stream that should have its
+ flow control window opened. If not present or ``None``, the
+ connection flow control window will be opened instead.
+ :type stream_id: ``int`` or ``None``
+ :returns: Nothing
+ :raises: ``ValueError``
+ """
+ if not (1 <= increment <= self.MAX_WINDOW_INCREMENT):
+ raise ValueError(
+ "Flow control increment must be between 1 and %d" %
+ self.MAX_WINDOW_INCREMENT
+ )
+
+ self.state_machine.process_input(ConnectionInputs.SEND_WINDOW_UPDATE)
+
+ if stream_id is not None:
+ stream = self.streams[stream_id]
+ frames = stream.increase_flow_control_window(
+ increment
+ )
+
+ self.config.logger.debug(
+ "Increase stream ID %d flow control window by %d",
+ stream_id, increment
+ )
+ else:
+ self._inbound_flow_control_window_manager.window_opened(increment)
+ f = WindowUpdateFrame(0)
+ f.window_increment = increment
+ frames = [f]
+
+ self.config.logger.debug(
+ "Increase connection flow control window by %d", increment
+ )
+
+ self._prepare_for_sending(frames)
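WINDOW_UPDATE increments must lie in [1, 2^31 - 1], and the resulting window may never exceed 2^31 - 1 (RFC 7540 Section 6.9.1). A sketch of both checks together — the overflow guard here stands in for the role that h2's `guard_increment_window` helper plays elsewhere in this module:

```python
MAX_WINDOW_INCREMENT = 2**31 - 1


def apply_window_update(current_window, increment):
    # The increment itself must be in range...
    if not (1 <= increment <= MAX_WINDOW_INCREMENT):
        raise ValueError(
            "Flow control increment must be between 1 and %d" %
            MAX_WINDOW_INCREMENT
        )
    # ...and the resulting window must not overflow 2**31 - 1.
    new_window = current_window + increment
    if new_window > MAX_WINDOW_INCREMENT:
        raise OverflowError("Flow control window overflows 2**31 - 1")
    return new_window
```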
+
+ def push_stream(self, stream_id, promised_stream_id, request_headers):
+ """
+ Push a response to the client by sending a PUSH_PROMISE frame.
+
+        If it is important to send HPACK "never indexed" header fields (as
+        defined in RFC 7541 Section 7.1.3), the user may instead provide
+        headers using the HPACK library's :class:`HeaderTuple` and
+        :class:`NeverIndexedHeaderTuple` objects.
+
+ :param stream_id: The ID of the stream that this push is a response to.
+ :type stream_id: ``int``
+ :param promised_stream_id: The ID of the stream that the pushed
+ response will be sent on.
+ :type promised_stream_id: ``int``
+ :param request_headers: The headers of the request that the pushed
+ response will be responding to.
+        :type request_headers: An iterable of two-tuples of bytestrings or
+            :class:`HeaderTuple` objects.
+ :returns: Nothing
+ """
+ self.config.logger.debug(
+ "Send Push Promise frame on stream ID %d", stream_id
+ )
+
+ if not self.remote_settings.enable_push:
+ raise ProtocolError("Remote peer has disabled stream push")
+
+ self.state_machine.process_input(ConnectionInputs.SEND_PUSH_PROMISE)
+ stream = self._get_stream_by_id(stream_id)
+
+ # We need to prevent users pushing streams in response to streams that
+ # they themselves have already pushed: see #163 and RFC 7540 § 6.6. The
+ # easiest way to do that is to assert that the stream_id is not even:
+ # this shortcut works because only servers can push and the state
+ # machine will enforce this.
+ if (stream_id % 2) == 0:
+ raise ProtocolError("Cannot recursively push streams.")
+
+ new_stream = self._begin_new_stream(
+ promised_stream_id, AllowedStreamIDs.EVEN
+ )
+ self.streams[promised_stream_id] = new_stream
+
+ frames = stream.push_stream_in_band(
+ promised_stream_id, request_headers, self.encoder
+ )
+ new_frames = new_stream.locally_pushed()
+ self._prepare_for_sending(frames + new_frames)
+
+ def ping(self, opaque_data):
+ """
+ Send a PING frame.
+
+ :param opaque_data: A bytestring of length 8 that will be sent in the
+ PING frame.
+ :returns: Nothing
+ """
+ self.config.logger.debug("Send Ping frame")
+
+ if not isinstance(opaque_data, bytes) or len(opaque_data) != 8:
+ raise ValueError("Invalid value for ping data: %r" % opaque_data)
+
+ self.state_machine.process_input(ConnectionInputs.SEND_PING)
+ f = PingFrame(0)
+ f.opaque_data = opaque_data
+ self._prepare_for_sending([f])
+
+ def reset_stream(self, stream_id, error_code=0):
+ """
+ Reset a stream.
+
+ This method forcibly closes a stream by sending a RST_STREAM frame for
+ a given stream. This is not a graceful closure. To gracefully end a
+        stream, try the :meth:`end_stream` method.
+
+ :param stream_id: The ID of the stream to reset.
+ :type stream_id: ``int``
+ :param error_code: (optional) The error code to use to reset the
+            stream. Defaults to :data:`ErrorCodes.NO_ERROR`.
+ :type error_code: ``int``
+ :returns: Nothing
+ """
+ self.config.logger.debug("Reset stream ID %d", stream_id)
+ self.state_machine.process_input(ConnectionInputs.SEND_RST_STREAM)
+ stream = self._get_stream_by_id(stream_id)
+ frames = stream.reset_stream(error_code)
+ self._prepare_for_sending(frames)
+
+ def close_connection(self, error_code=0, additional_data=None,
+ last_stream_id=None):
+
+ """
+ Close a connection, emitting a GOAWAY frame.
+
+ .. versionchanged:: 2.4.0
+ Added ``additional_data`` and ``last_stream_id`` arguments.
+
+ :param error_code: (optional) The error code to send in the GOAWAY
+ frame.
+ :param additional_data: (optional) Additional debug data indicating
+ a reason for closing the connection. Must be a bytestring.
+ :param last_stream_id: (optional) The last stream which was processed
+ by the sender. Defaults to ``highest_inbound_stream_id``.
+ :returns: Nothing
+ """
+ self.config.logger.debug("Close connection")
+ self.state_machine.process_input(ConnectionInputs.SEND_GOAWAY)
+
+ # Additional_data must be bytes
+ if additional_data is not None:
+ assert isinstance(additional_data, bytes)
+
+ if last_stream_id is None:
+ last_stream_id = self.highest_inbound_stream_id
+
+ f = GoAwayFrame(
+ stream_id=0,
+ last_stream_id=last_stream_id,
+ error_code=error_code,
+ additional_data=(additional_data or b'')
+ )
+ self._prepare_for_sending([f])
+
+ def update_settings(self, new_settings):
+ """
+ Update the local settings. This will prepare and emit the appropriate
+ SETTINGS frame.
+
+ :param new_settings: A dictionary of {setting: new value}
+ """
+ self.config.logger.debug(
+ "Update connection settings to %s", new_settings
+ )
+ self.state_machine.process_input(ConnectionInputs.SEND_SETTINGS)
+ self.local_settings.update(new_settings)
+ s = SettingsFrame(0)
+ s.settings = new_settings
+ self._prepare_for_sending([s])
+
+ def advertise_alternative_service(self,
+ field_value,
+ origin=None,
+ stream_id=None):
+ """
+ Notify a client about an available Alternative Service.
+
+        An Alternative Service is defined in RFC 7838. An Alternative Service
+ notification informs a client that a given origin is also available
+ elsewhere.
+
+ Alternative Services can be advertised in two ways. Firstly, they can
+ be advertised explicitly: that is, a server can say "origin X is also
+ available at Y". To advertise like this, set the ``origin`` argument
+ and not the ``stream_id`` argument. Alternatively, they can be
+ advertised implicitly: that is, a server can say "the origin you're
+ contacting on stream X is also available at Y". To advertise like this,
+ set the ``stream_id`` argument and not the ``origin`` argument.
+
+ The explicit method of advertising can be done as long as the
+ connection is active. The implicit method can only be done after the
+ client has sent the request headers and before the server has sent the
+ response headers: outside of those points, Hyper-h2 will forbid sending
+ the Alternative Service advertisement by raising a ProtocolError.
+
+ The ``field_value`` parameter is specified in RFC 7838. Hyper-h2 does
+ not validate or introspect this argument: the user is required to
+ ensure that it's well-formed. ``field_value`` corresponds to RFC 7838's
+ "Alternative Service Field Value".
+
+ .. note:: It is strongly preferred to use the explicit method of
+ advertising Alternative Services. The implicit method of
+ advertising Alternative Services has a number of subtleties
+ and can lead to inconsistencies between the server and
+ client. Hyper-h2 allows both mechanisms, but caution is
+ strongly advised.
+
+ .. versionadded:: 2.3.0
+
+ :param field_value: The RFC 7838 Alternative Service Field Value. This
+ argument is not introspected by Hyper-h2: the user is responsible
+ for ensuring that it is well-formed.
+ :type field_value: ``bytes``
+
+ :param origin: The origin/authority to which the Alternative Service
+ being advertised applies. Must not be provided at the same time as
+ ``stream_id``.
+ :type origin: ``bytes`` or ``None``
+
+ :param stream_id: The ID of the stream which was sent to the authority
+ for which this Alternative Service advertisement applies. Must not
+ be provided at the same time as ``origin``.
+ :type stream_id: ``int`` or ``None``
+
+ :returns: Nothing.
+ """
+ if not isinstance(field_value, bytes):
+ raise ValueError("Field must be bytestring.")
+
+ if origin is not None and stream_id is not None:
+ raise ValueError("Must not provide both origin and stream_id")
+
+ self.state_machine.process_input(
+ ConnectionInputs.SEND_ALTERNATIVE_SERVICE
+ )
+
+ if origin is not None:
+ # This ALTSVC is sent on stream zero.
+ f = AltSvcFrame(stream_id=0)
+ f.origin = origin
+ f.field = field_value
+ frames = [f]
+ else:
+ stream = self._get_stream_by_id(stream_id)
+ frames = stream.advertise_alternative_service(field_value)
+
+ self._prepare_for_sending(frames)
+
+ def prioritize(self, stream_id, weight=None, depends_on=None,
+ exclusive=None):
+ """
+ Notify a server about the priority of a stream.
+
+ Stream priorities are a form of guidance to a remote server: they
+ inform the server about how important a given response is, so that the
+ server may allocate its resources (e.g. bandwidth, CPU time, etc.)
+ accordingly. This exists to allow clients to ensure that the most
+ important data arrives earlier, while less important data does not
+ starve out the more important data.
+
+        Stream priorities are explained in depth in RFC 7540 Section 5.3.
+
+ This method updates the priority information of a single stream. It may
+ be called well before a stream is actively in use, or well after a
+ stream is closed.
+
+ .. warning:: RFC 7540 allows for servers to change the priority of
+ streams. However, hyper-h2 **does not** allow server
+ stacks to do this. This is because most clients do not
+ adequately know how to respond when provided conflicting
+ priority information, and relatively little utility is
+ provided by making that functionality available.
+
+ .. note:: hyper-h2 **does not** maintain any information about the
+ RFC 7540 priority tree. That means that hyper-h2 does not
+ prevent incautious users from creating invalid priority
+ trees, particularly by creating priority loops. While some
+ basic error checking is provided by hyper-h2, users are
+ strongly recommended to understand their prioritisation
+ strategies before using the priority tools here.
+
+ .. note:: Priority information is strictly advisory. Servers are
+ allowed to disregard it entirely. Avoid relying on the idea
+ that your priority signaling will definitely be obeyed.
+
+ .. versionadded:: 2.4.0
+
+ :param stream_id: The ID of the stream to prioritize.
+ :type stream_id: ``int``
+
+ :param weight: The weight to give the stream. Defaults to ``16``, the
+ default weight of any stream. May be any value between ``1`` and
+ ``256`` inclusive. The relative weight of a stream indicates what
+ proportion of available resources will be allocated to that
+ stream.
+ :type weight: ``int``
+
+ :param depends_on: The ID of the stream on which this stream depends.
+ This stream will only be progressed if it is impossible to
+ progress the parent stream (the one on which this one depends).
+ Passing the value ``0`` means that this stream does not depend on
+ any other. Defaults to ``0``.
+ :type depends_on: ``int``
+
+ :param exclusive: Whether this stream is an exclusive dependency of its
+ "parent" stream (i.e. the stream given by ``depends_on``). If a
+ stream is an exclusive dependency of another, that means that all
+ previously-set children of the parent are moved to become children
+ of the new exclusively-dependent stream. Defaults to ``False``.
+ :type exclusive: ``bool``
+ """
+ if not self.config.client_side:
+ raise RFC1122Error("Servers SHOULD NOT prioritize streams.")
+
+ self.state_machine.process_input(
+ ConnectionInputs.SEND_PRIORITY
+ )
+
+ frame = PriorityFrame(stream_id)
+ frame = _add_frame_priority(frame, weight, depends_on, exclusive)
+
+ self._prepare_for_sending([frame])
+
+ def local_flow_control_window(self, stream_id):
+ """
+ Returns the maximum amount of data that can be sent on stream
+ ``stream_id``.
+
+ This value will never be larger than the total data that can be sent on
+ the connection: even if the given stream allows more data, the
+ connection window provides a logical maximum to the amount of data that
+ can be sent.
+
+ The maximum data that can be sent in a single data frame on a stream
+ is either this value, or the maximum frame size, whichever is
+ *smaller*.
+
+ :param stream_id: The ID of the stream whose flow control window is
+ being queried.
+ :type stream_id: ``int``
+ :returns: The amount of data in bytes that can be sent on the stream
+ before the flow control window is exhausted.
+ :rtype: ``int``
+ """
+ stream = self._get_stream_by_id(stream_id)
+ return min(
+ self.outbound_flow_control_window,
+ stream.outbound_flow_control_window
+ )
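The effective send allowance is therefore the smaller of the connection window and the per-stream window, and a single DATA frame is further capped by the maximum frame size. A sketch of both limits (illustrative helper, not h2 API):

```python
def sendable_now(connection_window, stream_window, max_frame_size):
    # How much may be sent on this stream at all right now...
    window = min(connection_window, stream_window)
    # ...and how much of that fits in a single DATA frame.
    per_frame = min(window, max_frame_size)
    return window, per_frame
```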
+
+ def remote_flow_control_window(self, stream_id):
+ """
+ Returns the maximum amount of data the remote peer can send on stream
+ ``stream_id``.
+
+ This value will never be larger than the total data that can be sent on
+ the connection: even if the given stream allows more data, the
+ connection window provides a logical maximum to the amount of data that
+ can be sent.
+
+ The maximum data that can be sent in a single data frame on a stream
+ is either this value, or the maximum frame size, whichever is
+ *smaller*.
+
+ :param stream_id: The ID of the stream whose flow control window is
+ being queried.
+ :type stream_id: ``int``
+ :returns: The amount of data in bytes that can be received on the
+ stream before the flow control window is exhausted.
+ :rtype: ``int``
+ """
+ stream = self._get_stream_by_id(stream_id)
+ return min(
+ self.inbound_flow_control_window,
+ stream.inbound_flow_control_window
+ )
+
+ def acknowledge_received_data(self, acknowledged_size, stream_id):
+ """
+        Inform the :class:`H2Connection` that a
+ certain number of flow-controlled bytes have been processed, and that
+ the space should be handed back to the remote peer at an opportune
+ time.
+
+ .. versionadded:: 2.5.0
+
+ :param acknowledged_size: The total *flow-controlled size* of the data
+ that has been processed. Note that this must include the amount of
+ padding that was sent with that data.
+ :type acknowledged_size: ``int``
+ :param stream_id: The ID of the stream on which this data was received.
+ :type stream_id: ``int``
+ :returns: Nothing
+ :rtype: ``None``
+ """
+ self.config.logger.debug(
+ "Ack received data on stream ID %d with size %d",
+ stream_id, acknowledged_size
+ )
+ if stream_id <= 0:
+ raise ValueError(
+ "Stream ID %d is not valid for acknowledge_received_data" %
+ stream_id
+ )
+ if acknowledged_size < 0:
+ raise ValueError("Cannot acknowledge negative data")
+
+ frames = []
+
+ conn_manager = self._inbound_flow_control_window_manager
+ conn_increment = conn_manager.process_bytes(acknowledged_size)
+ if conn_increment:
+ f = WindowUpdateFrame(0)
+ f.window_increment = conn_increment
+ frames.append(f)
+
+ try:
+ stream = self._get_stream_by_id(stream_id)
+ except StreamClosedError:
+ # The stream is already gone. We're not worried about incrementing
+ # the window in this case.
+ pass
+ else:
+ # No point incrementing the windows of closed streams.
+ if stream.open:
+ frames.extend(
+ stream.acknowledge_received_data(acknowledged_size)
+ )
+
+ self._prepare_for_sending(frames)
+
+ def data_to_send(self, amount=None):
+ """
+ Returns some data for sending out of the internal data buffer.
+
+ This method is analogous to ``read`` on a file-like object, but it
+ doesn't block. Instead, it returns as much data as the user asks for,
+ or less if that much data is not available. It does not perform any
+ I/O, and so uses a different name.
+
+ :param amount: (optional) The maximum amount of data to return. If not
+ set, or set to ``None``, will return as much data as possible.
+ :type amount: ``int``
+ :returns: A bytestring containing the data to send on the wire.
+ :rtype: ``bytes``
+ """
+ if amount is None:
+ data = bytes(self._data_to_send)
+ self._data_to_send = bytearray()
+ return data
+ else:
+ data = bytes(self._data_to_send[:amount])
+ self._data_to_send = self._data_to_send[amount:]
+ return data
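`data_to_send` drains an internal bytearray rather than performing I/O: asking for more than is buffered simply returns what is there, and asking again returns the empty bytestring. The same non-blocking read semantics as a small self-contained sketch:

```python
class OutboundBuffer:
    """Minimal sketch of a drainable outbound byte buffer."""

    def __init__(self):
        self._buf = bytearray()

    def write(self, data):
        self._buf += data

    def read(self, amount=None):
        # Like file read(), but never blocks: returns at most `amount`
        # bytes, or everything buffered when amount is None.
        if amount is None:
            data, self._buf = bytes(self._buf), bytearray()
        else:
            data, self._buf = bytes(self._buf[:amount]), self._buf[amount:]
        return data
```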
+
+ def clear_outbound_data_buffer(self):
+ """
+ Clears the outbound data buffer, such that if this call was immediately
+        followed by a call to :meth:`data_to_send`, that
+ call would return no data.
+
+ This method should not normally be used, but is made available to avoid
+ exposing implementation details.
+ """
+ self._data_to_send = bytearray()
+
+ def _acknowledge_settings(self):
+ """
+ Acknowledge settings that have been received.
+
+ .. versionchanged:: 2.0.0
+ Removed from public API, removed useless ``event`` parameter, made
+ automatic.
+
+ :returns: Nothing
+ """
+ self.state_machine.process_input(ConnectionInputs.SEND_SETTINGS)
+
+ changes = self.remote_settings.acknowledge()
+
+ if SettingCodes.INITIAL_WINDOW_SIZE in changes:
+ setting = changes[SettingCodes.INITIAL_WINDOW_SIZE]
+ self._flow_control_change_from_settings(
+ setting.original_value,
+ setting.new_value,
+ )
+
+ # HEADER_TABLE_SIZE changes by the remote part affect our encoder: cf.
+ # RFC 7540 Section 6.5.2.
+ if SettingCodes.HEADER_TABLE_SIZE in changes:
+ setting = changes[SettingCodes.HEADER_TABLE_SIZE]
+ self.encoder.header_table_size = setting.new_value
+
+ if SettingCodes.MAX_FRAME_SIZE in changes:
+ setting = changes[SettingCodes.MAX_FRAME_SIZE]
+ self.max_outbound_frame_size = setting.new_value
+ for stream in self.streams.values():
+ stream.max_outbound_frame_size = setting.new_value
+
+ f = SettingsFrame(0)
+ f.flags.add('ACK')
+ return [f]
+
+ def _flow_control_change_from_settings(self, old_value, new_value):
+ """
+ Update flow control windows in response to a change in the value of
+ SETTINGS_INITIAL_WINDOW_SIZE.
+
+ When this setting is changed, it automatically updates all flow control
+ windows by the delta in the settings values. Note that it does not
+ increment the *connection* flow control window, per section 6.9.2 of
+ RFC 7540.
+ """
+ delta = new_value - old_value
+
+ for stream in self.streams.values():
+ stream.outbound_flow_control_window = guard_increment_window(
+ stream.outbound_flow_control_window,
+ delta
+ )
+
+ def _inbound_flow_control_change_from_settings(self, old_value, new_value):
+ """
+ Update remote flow control windows in response to a change in the value
+ of SETTINGS_INITIAL_WINDOW_SIZE.
+
+ When this setting is changed, it automatically updates all remote flow
+ control windows by the delta in the settings values.
+ """
+ delta = new_value - old_value
+
+ for stream in self.streams.values():
+ stream._inbound_flow_control_change_from_settings(delta)
+
+ def receive_data(self, data):
+ """
+ Pass some received HTTP/2 data to the connection for handling.
+
+ :param data: The data received from the remote peer on the network.
+ :type data: ``bytes``
+ :returns: A list of events that the remote peer triggered by sending
+ this data.
+ """
+ self.config.logger.trace(
+ "Process received data on connection. Received data: %r", data
+ )
+
+ events = []
+ self.incoming_buffer.add_data(data)
+ self.incoming_buffer.max_frame_size = self.max_inbound_frame_size
+
+ try:
+ for frame in self.incoming_buffer:
+ events.extend(self._receive_frame(frame))
+ except InvalidPaddingError:
+ self._terminate_connection(ErrorCodes.PROTOCOL_ERROR)
+ raise ProtocolError("Received frame with invalid padding.")
+ except ProtocolError as e:
+ # For whatever reason, receiving the frame caused a protocol error.
+ # We should prepare to emit a GoAway frame before throwing the
+ # exception up further. No need for an event: the exception will
+ # do fine.
+ self._terminate_connection(e.error_code)
+ raise
+
+ return events
+
+ def _receive_frame(self, frame):
+ """
+ Handle a frame received on the connection.
+
+ .. versionchanged:: 2.0.0
+ Removed from the public API.
+ """
+ try:
+ # I don't love using __class__ here, maybe reconsider it.
+ frames, events = self._frame_dispatch_table[frame.__class__](frame)
+ except StreamClosedError as e:
+ # If the stream was closed by RST_STREAM, we just send a RST_STREAM
+ # to the remote peer. Otherwise, this is a connection error, and so
+ # we will re-raise to trigger one.
+ if self._stream_is_closed_by_reset(e.stream_id):
+ f = RstStreamFrame(e.stream_id)
+ f.error_code = e.error_code
+ self._prepare_for_sending([f])
+ events = e._events
+ else:
+ raise
+ except StreamIDTooLowError as e:
+ # The stream ID seems invalid. This may happen when the closed
+ # stream has been cleaned up, or when the remote peer has opened a
+ # new stream with a higher stream ID than this one, forcing it
+ # closed implicitly.
+ #
+ # Check how the stream was closed: depending on the mechanism, it
+ # is either a stream error or a connection error.
+ if self._stream_is_closed_by_reset(e.stream_id):
+ # Closed by RST_STREAM is a stream error.
+ f = RstStreamFrame(e.stream_id)
+ f.error_code = ErrorCodes.STREAM_CLOSED
+ self._prepare_for_sending([f])
+ events = []
+ elif self._stream_is_closed_by_end(e.stream_id):
+ # Closed by END_STREAM is a connection error.
+ raise StreamClosedError(e.stream_id)
+ else:
+ # Closed implicitly, also a connection error, but of type
+ # PROTOCOL_ERROR.
+ raise
+ else:
+ self._prepare_for_sending(frames)
+
+ return events
+
+ def _terminate_connection(self, error_code):
+ """
+ Terminate the connection early. Used in error handling blocks to send
+ GOAWAY frames.
+ """
+ f = GoAwayFrame(0)
+ f.last_stream_id = self.highest_inbound_stream_id
+ f.error_code = error_code
+ self.state_machine.process_input(ConnectionInputs.SEND_GOAWAY)
+ self._prepare_for_sending([f])
+
+ def _receive_headers_frame(self, frame):
+ """
+ Receive a headers frame on the connection.
+ """
+ # If necessary, check we can open the stream. Also validate that the
+ # stream ID is valid.
+ if frame.stream_id not in self.streams:
+ max_open_streams = self.local_settings.max_concurrent_streams
+ if (self.open_inbound_streams + 1) > max_open_streams:
+ raise TooManyStreamsError(
+ "Max inbound streams is %d, %d open" %
+ (max_open_streams, self.open_inbound_streams)
+ )
+
+ # Let's decode the headers. We handle headers as bytes internally up
+ # until we hang them off the event, at which point we may optionally
+ # convert them to unicode.
+ headers = _decode_headers(self.decoder, frame.data)
+
+ events = self.state_machine.process_input(
+ ConnectionInputs.RECV_HEADERS
+ )
+ stream = self._get_or_create_stream(
+ frame.stream_id, AllowedStreamIDs(not self.config.client_side)
+ )
+ frames, stream_events = stream.receive_headers(
+ headers,
+ 'END_STREAM' in frame.flags,
+ self.config.header_encoding
+ )
+
+ if 'PRIORITY' in frame.flags:
+ p_frames, p_events = self._receive_priority_frame(frame)
+ stream_events[0].priority_updated = p_events[0]
+ stream_events.extend(p_events)
+ assert not p_frames
+
+ return frames, events + stream_events
+
+ def _receive_push_promise_frame(self, frame):
+ """
+ Receive a push-promise frame on the connection.
+ """
+ if not self.local_settings.enable_push:
+ raise ProtocolError("Received pushed stream")
+
+ pushed_headers = _decode_headers(self.decoder, frame.data)
+
+ events = self.state_machine.process_input(
+ ConnectionInputs.RECV_PUSH_PROMISE
+ )
+
+ try:
+ stream = self._get_stream_by_id(frame.stream_id)
+ except NoSuchStreamError:
+ # We need to check if the parent stream was reset by us. If it was
+ # then we presume that the PUSH_PROMISE was in flight when we reset
+ # the parent stream. Rather than accept the new stream, just reset
+ # it.
+ #
+ # If this was closed naturally, however, we should call this a
+ # PROTOCOL_ERROR: pushing a stream on a naturally closed stream is
+ # a real problem because it creates a brand new stream that the
+ # remote peer now believes exists.
+ if (self._stream_closed_by(frame.stream_id) ==
+ StreamClosedBy.SEND_RST_STREAM):
+ f = RstStreamFrame(frame.promised_stream_id)
+ f.error_code = ErrorCodes.REFUSED_STREAM
+ return [f], events
+
+ raise ProtocolError("Attempted to push on closed stream.")
+
+ # We need to prevent peers pushing streams in response to streams that
+ # they themselves have already pushed: see #163 and RFC 7540 § 6.6. The
+ # easiest way to do that is to assert that the stream_id is not even:
+ # this shortcut works because only servers can push and the state
+ # machine will enforce this.
+ if (frame.stream_id % 2) == 0:
+ raise ProtocolError("Cannot recursively push streams.")
+
+ try:
+ frames, stream_events = stream.receive_push_promise_in_band(
+ frame.promised_stream_id,
+ pushed_headers,
+ self.config.header_encoding,
+ )
+ except StreamClosedError:
+ # The parent stream was reset by us, so we presume that
+ # PUSH_PROMISE was in flight when we reset the parent stream.
+ # So we just reset the new stream.
+ f = RstStreamFrame(frame.promised_stream_id)
+ f.error_code = ErrorCodes.REFUSED_STREAM
+ return [f], events
+
+ new_stream = self._begin_new_stream(
+ frame.promised_stream_id, AllowedStreamIDs.EVEN
+ )
+ self.streams[frame.promised_stream_id] = new_stream
+ new_stream.remotely_pushed(pushed_headers)
+
+ return frames, events + stream_events
+
+ def _handle_data_on_closed_stream(self, events, exc, frame):
+ # This stream is already closed - and yet we received a DATA frame.
+ # The received DATA frame counts towards the connection flow window.
+ # We need to manually acknowledge the DATA frame to update the flow
+ # window of the connection. Otherwise the whole connection stalls due
+ # to the inbound flow window being 0.
+ frames = []
+ conn_manager = self._inbound_flow_control_window_manager
+ conn_increment = conn_manager.process_bytes(
+ frame.flow_controlled_length
+ )
+ if conn_increment:
+ f = WindowUpdateFrame(0)
+ f.window_increment = conn_increment
+ frames.append(f)
+ self.config.logger.debug(
+ "Received DATA frame on closed stream %d - "
+ "auto-emitted a WINDOW_UPDATE by %d",
+ frame.stream_id, conn_increment
+ )
+ f = RstStreamFrame(exc.stream_id)
+ f.error_code = exc.error_code
+ frames.append(f)
+ self.config.logger.debug(
+ "Stream %d already CLOSED or cleaned up - "
+ "auto-emitted a RST_FRAME" % frame.stream_id
+ )
+ return frames, events + exc._events
+
+ def _receive_data_frame(self, frame):
+ """
+ Receive a data frame on the connection.
+ """
+ flow_controlled_length = frame.flow_controlled_length
+
+ events = self.state_machine.process_input(
+ ConnectionInputs.RECV_DATA
+ )
+ self._inbound_flow_control_window_manager.window_consumed(
+ flow_controlled_length
+ )
+
+ try:
+ stream = self._get_stream_by_id(frame.stream_id)
+ frames, stream_events = stream.receive_data(
+ frame.data,
+ 'END_STREAM' in frame.flags,
+ flow_controlled_length
+ )
+ except StreamClosedError as e:
+ # This stream is either marked as CLOSED or already gone from our
+ # internal state.
+ return self._handle_data_on_closed_stream(events, e, frame)
+
+ return frames, events + stream_events
+
+ def _receive_settings_frame(self, frame):
+ """
+ Receive a SETTINGS frame on the connection.
+ """
+ events = self.state_machine.process_input(
+ ConnectionInputs.RECV_SETTINGS
+ )
+
+ # This is an ack of the local settings.
+ if 'ACK' in frame.flags:
+ changed_settings = self._local_settings_acked()
+ ack_event = SettingsAcknowledged()
+ ack_event.changed_settings = changed_settings
+ events.append(ack_event)
+ return [], events
+
+ # Add the new settings.
+ self.remote_settings.update(frame.settings)
+ events.append(
+ RemoteSettingsChanged.from_settings(
+ self.remote_settings, frame.settings
+ )
+ )
+ frames = self._acknowledge_settings()
+
+ return frames, events
+
+ def _receive_window_update_frame(self, frame):
+ """
+ Receive a WINDOW_UPDATE frame on the connection.
+ """
+ # hyperframe will take care of validating the window_increment.
+ # If we reach in here, we can assume a valid value.
+
+ events = self.state_machine.process_input(
+ ConnectionInputs.RECV_WINDOW_UPDATE
+ )
+
+ if frame.stream_id:
+ try:
+ stream = self._get_stream_by_id(frame.stream_id)
+ frames, stream_events = stream.receive_window_update(
+ frame.window_increment
+ )
+ except StreamClosedError:
+ return [], events
+ else:
+ # Increment our local flow control window.
+ self.outbound_flow_control_window = guard_increment_window(
+ self.outbound_flow_control_window,
+ frame.window_increment
+ )
+
+ # FIXME: Should we split this into one event per active stream?
+ window_updated_event = WindowUpdated()
+ window_updated_event.stream_id = 0
+ window_updated_event.delta = frame.window_increment
+ stream_events = [window_updated_event]
+ frames = []
+
+ return frames, events + stream_events
+
+ def _receive_ping_frame(self, frame):
+ """
+ Receive a PING frame on the connection.
+ """
+ events = self.state_machine.process_input(
+ ConnectionInputs.RECV_PING
+ )
+ flags = []
+
+ if 'ACK' in frame.flags:
+ evt = PingAckReceived()
+ else:
+ evt = PingReceived()
+
+ # automatically ACK the PING with the same 'opaque data'
+ f = PingFrame(0)
+ f.flags = {'ACK'}
+ f.opaque_data = frame.opaque_data
+ flags.append(f)
+
+ evt.ping_data = frame.opaque_data
+ events.append(evt)
+
+ return flags, events
+
+ def _receive_rst_stream_frame(self, frame):
+ """
+ Receive a RST_STREAM frame on the connection.
+ """
+ events = self.state_machine.process_input(
+ ConnectionInputs.RECV_RST_STREAM
+ )
+ try:
+ stream = self._get_stream_by_id(frame.stream_id)
+ except NoSuchStreamError:
+ # The stream is missing. That's ok, we just do nothing here.
+ stream_frames = []
+ stream_events = []
+ else:
+ stream_frames, stream_events = stream.stream_reset(frame)
+
+ return stream_frames, events + stream_events
+
+ def _receive_priority_frame(self, frame):
+ """
+ Receive a PRIORITY frame on the connection.
+ """
+ events = self.state_machine.process_input(
+ ConnectionInputs.RECV_PRIORITY
+ )
+
+ event = PriorityUpdated()
+ event.stream_id = frame.stream_id
+ event.depends_on = frame.depends_on
+ event.exclusive = frame.exclusive
+
+ # Weight is an integer between 1 and 256, but the byte only allows
+ # 0 to 255: add one.
+ event.weight = frame.stream_weight + 1
+
+ # A stream may not depend on itself.
+ if event.depends_on == frame.stream_id:
+ raise ProtocolError(
+ "Stream %d may not depend on itself" % frame.stream_id
+ )
+ events.append(event)
+
+ return [], events
+
+ def _receive_goaway_frame(self, frame):
+ """
+ Receive a GOAWAY frame on the connection.
+ """
+ events = self.state_machine.process_input(
+ ConnectionInputs.RECV_GOAWAY
+ )
+
+ # Clear the outbound data buffer: we cannot send further data now.
+ self.clear_outbound_data_buffer()
+
+ # Fire an appropriate ConnectionTerminated event.
+ new_event = ConnectionTerminated()
+ new_event.error_code = _error_code_from_int(frame.error_code)
+ new_event.last_stream_id = frame.last_stream_id
+ new_event.additional_data = (frame.additional_data
+ if frame.additional_data else None)
+ events.append(new_event)
+
+ return [], events
+
+ def _receive_naked_continuation(self, frame):
+ """
+ A naked CONTINUATION frame has been received. This is always an error,
+ but the type of error it is depends on the state of the stream and must
+ transition the state of the stream, so we need to pass it to the
+ appropriate stream.
+ """
+ stream = self._get_stream_by_id(frame.stream_id)
+ stream.receive_continuation()
+ assert False, "Should not be reachable"
+
+ def _receive_alt_svc_frame(self, frame):
+ """
+ An ALTSVC frame has been received. This frame, specified in RFC 7838,
+ is used to advertise alternative places where the same service can be
+ reached.
+
+ This frame can optionally be received either on a stream or on stream
+ 0, and its semantics are different in each case.
+ """
+ events = self.state_machine.process_input(
+ ConnectionInputs.RECV_ALTERNATIVE_SERVICE
+ )
+ frames = []
+
+ if frame.stream_id:
+ # Given that it makes no sense to receive ALTSVC on a stream
+ # before that stream has been opened with a HEADERS frame, the
+ # ALTSVC frame cannot create a stream. If the stream is not
+ # present, we simply ignore the frame.
+ try:
+ stream = self._get_stream_by_id(frame.stream_id)
+ except (NoSuchStreamError, StreamClosedError):
+ pass
+ else:
+ stream_frames, stream_events = stream.receive_alt_svc(frame)
+ frames.extend(stream_frames)
+ events.extend(stream_events)
+ else:
+ # This frame is sent on stream 0. The origin field on the frame
+ # must be present: annoyingly, if it isn't, that is not a
+ # ProtocolError, so we just need to ignore the frame.
+ if not frame.origin:
+ return frames, events
+
+ # If we're a server, we want to ignore this (RFC 7838 says so).
+ if not self.config.client_side:
+ return frames, events
+
+ event = AlternativeServiceAvailable()
+ event.origin = frame.origin
+ event.field_value = frame.field
+ events.append(event)
+
+ return frames, events
+
+ def _receive_unknown_frame(self, frame):
+ """
+ We have received a frame that we do not understand. This is almost
+ certainly an extension frame, though it's impossible to be entirely
+ sure.
+
+ RFC 7540 § 5.5 says that we MUST ignore unknown frame types: so we
+ do. We do notify the user that we received one, however.
+ """
+ # All we do here is log.
+ self.config.logger.debug(
+ "Received unknown extension frame (ID %d)", frame.stream_id
+ )
+ event = UnknownFrameReceived()
+ event.frame = frame
+ return [], [event]
+
+ def _local_settings_acked(self):
+ """
+ Handle the local settings being ACKed, update internal state.
+ """
+ changes = self.local_settings.acknowledge()
+
+ if SettingCodes.INITIAL_WINDOW_SIZE in changes:
+ setting = changes[SettingCodes.INITIAL_WINDOW_SIZE]
+ self._inbound_flow_control_change_from_settings(
+ setting.original_value,
+ setting.new_value,
+ )
+
+ if SettingCodes.MAX_HEADER_LIST_SIZE in changes:
+ setting = changes[SettingCodes.MAX_HEADER_LIST_SIZE]
+ self.decoder.max_header_list_size = setting.new_value
+
+ if SettingCodes.MAX_FRAME_SIZE in changes:
+ setting = changes[SettingCodes.MAX_FRAME_SIZE]
+ self.max_inbound_frame_size = setting.new_value
+
+ if SettingCodes.HEADER_TABLE_SIZE in changes:
+ setting = changes[SettingCodes.HEADER_TABLE_SIZE]
+ # This is safe across all hpack versions: some versions just won't
+ # respect it.
+ self.decoder.max_allowed_table_size = setting.new_value
+
+ return changes
+
+ def _stream_id_is_outbound(self, stream_id):
+ """
+ Returns ``True`` if the stream ID corresponds to an outbound stream
+ (one initiated by this peer), returns ``False`` otherwise.
+ """
+ return (stream_id % 2 == int(self.config.client_side))
+
+ def _stream_closed_by(self, stream_id):
+ """
+ Returns how the stream was closed.
+
+ The return value will be either a member of
+ ``h2.stream.StreamClosedBy`` or ``None``. If ``None``, the stream was
+ closed implicitly by the peer opening a stream with a higher stream ID
+ before opening this one.
+ """
+ if stream_id in self.streams:
+ return self.streams[stream_id].closed_by
+ if stream_id in self._closed_streams:
+ return self._closed_streams[stream_id]
+ return None
+
+ def _stream_is_closed_by_reset(self, stream_id):
+ """
+ Returns ``True`` if the stream was closed by sending or receiving a
+ RST_STREAM frame. Returns ``False`` otherwise.
+ """
+ return self._stream_closed_by(stream_id) in (
+ StreamClosedBy.RECV_RST_STREAM, StreamClosedBy.SEND_RST_STREAM
+ )
+
+ def _stream_is_closed_by_end(self, stream_id):
+ """
+ Returns ``True`` if the stream was closed by sending or receiving an
+ END_STREAM flag in a HEADERS or DATA frame. Returns ``False``
+ otherwise.
+ """
+ return self._stream_closed_by(stream_id) in (
+ StreamClosedBy.RECV_END_STREAM, StreamClosedBy.SEND_END_STREAM
+ )
+
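The parity check in ``_stream_id_is_outbound`` above can be illustrated in isolation. A minimal sketch, assuming the RFC 7540 § 5.1.1 convention that clients open odd-numbered streams and servers even-numbered ones (the free-standing function name is hypothetical):

```python
def stream_id_is_outbound(stream_id: int, client_side: bool) -> bool:
    # Clients initiate odd stream IDs, servers even ones, so
    # "did we open this stream?" reduces to a parity check against
    # our role, exactly as in the method above.
    return stream_id % 2 == int(client_side)


assert stream_id_is_outbound(1, client_side=True)       # client opened 1
assert not stream_id_is_outbound(2, client_side=True)   # 2 was pushed to us
assert stream_id_is_outbound(2, client_side=False)      # server opened 2
```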
+
+def _add_frame_priority(frame, weight=None, depends_on=None, exclusive=None):
+ """
+ Adds priority data to a given frame. Does not change any flags set on that
+ frame: if the caller is adding priority information to a HEADERS frame they
+ must set that themselves.
+
+ This method also deliberately sets defaults for anything missing.
+
+ This method validates the input values.
+ """
+ # A stream may not depend on itself.
+ if depends_on == frame.stream_id:
+ raise ProtocolError(
+ "Stream %d may not depend on itself" % frame.stream_id
+ )
+
+ # Weight must be between 1 and 256.
+ if weight is not None:
+ if weight > 256 or weight < 1:
+ raise ProtocolError(
+ "Weight must be between 1 and 256, not %d" % weight
+ )
+ else:
+ # Weight is an integer between 1 and 256, but the byte only allows
+ # 0 to 255: subtract one.
+ weight -= 1
+
+ # Set defaults for anything not provided.
+ weight = weight if weight is not None else 15
+ depends_on = depends_on if depends_on is not None else 0
+ exclusive = exclusive if exclusive is not None else False
+
+ frame.stream_weight = weight
+ frame.depends_on = depends_on
+ frame.exclusive = exclusive
+
+ return frame
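The off-by-one in ``_add_frame_priority`` mirrors the one in ``_receive_priority_frame``: weights are 1 to 256 in the API but a single byte (0 to 255) on the wire, so senders subtract one and receivers add one. A standalone sketch of that round trip (the helper names are hypothetical):

```python
def weight_to_wire(weight: int) -> int:
    """Encode an HTTP/2 priority weight (1-256) as a wire byte (0-255)."""
    if weight > 256 or weight < 1:
        raise ValueError("Weight must be between 1 and 256, not %d" % weight)
    return weight - 1


def weight_from_wire(byte: int) -> int:
    """Decode a wire byte (0-255) back into a priority weight (1-256)."""
    return byte + 1


assert weight_to_wire(16) == 15     # the default weight, 16, on the wire
assert weight_from_wire(0) == 1
assert weight_from_wire(weight_to_wire(256)) == 256
```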
+
+
+def _decode_headers(decoder, encoded_header_block):
+ """
+ Decode a HPACK-encoded header block, translating HPACK exceptions into
+ sensible hyper-h2 errors.
+
+ This only ever returns bytestring headers: hyper-h2 may emit them as
+ unicode later, but internally it processes them as bytestrings only.
+ """
+ try:
+ return decoder.decode(encoded_header_block, raw=True)
+ except OversizedHeaderListError as e:
+ # This is a symptom of a HPACK bomb attack: the user has
+ # disregarded our requirements on how large a header block we'll
+ # accept.
+ raise DenialOfServiceError("Oversized header block: %s" % e)
+ except (HPACKError, IndexError, TypeError, UnicodeDecodeError) as e:
+ # We should only need HPACKError here, but versions of HPACK older
+ # than 2.1.0 throw all three others as well. For maximum
+ # compatibility, catch all of them.
+ raise ProtocolError("Error decoding header block: %s" % e)
diff --git a/.venv/lib/python3.9/site-packages/h2/errors.py b/.venv/lib/python3.9/site-packages/h2/errors.py
new file mode 100644
index 0000000..303df59
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2/errors.py
@@ -0,0 +1,75 @@
+# -*- coding: utf-8 -*-
+"""
+h2/errors
+~~~~~~~~~
+
+Global error code registry containing the established HTTP/2 error codes.
+
+The current registry is available at:
+https://tools.ietf.org/html/rfc7540#section-11.4
+"""
+import enum
+
+
+class ErrorCodes(enum.IntEnum):
+ """
+ All known HTTP/2 error codes.
+
+ .. versionadded:: 2.5.0
+ """
+ #: Graceful shutdown.
+ NO_ERROR = 0x0
+
+ #: Protocol error detected.
+ PROTOCOL_ERROR = 0x1
+
+ #: Implementation fault.
+ INTERNAL_ERROR = 0x2
+
+ #: Flow-control limits exceeded.
+ FLOW_CONTROL_ERROR = 0x3
+
+ #: Settings not acknowledged.
+ SETTINGS_TIMEOUT = 0x4
+
+ #: Frame received for closed stream.
+ STREAM_CLOSED = 0x5
+
+ #: Frame size incorrect.
+ FRAME_SIZE_ERROR = 0x6
+
+ #: Stream not processed.
+ REFUSED_STREAM = 0x7
+
+ #: Stream cancelled.
+ CANCEL = 0x8
+
+ #: Compression state not updated.
+ COMPRESSION_ERROR = 0x9
+
+ #: TCP connection error for CONNECT method.
+ CONNECT_ERROR = 0xa
+
+ #: Processing capacity exceeded.
+ ENHANCE_YOUR_CALM = 0xb
+
+ #: Negotiated TLS parameters not acceptable.
+ INADEQUATE_SECURITY = 0xc
+
+ #: Use HTTP/1.1 for the request.
+ HTTP_1_1_REQUIRED = 0xd
+
+
+def _error_code_from_int(code):
+ """
+ Given an integer error code, returns either one of :class:`ErrorCodes
+ <h2.errors.ErrorCodes>` or, if not present in the known set of codes,
+ returns the integer directly.
+ """
+ try:
+ return ErrorCodes(code)
+ except ValueError:
+ return code
+
+
+__all__ = ['ErrorCodes']
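The ``_error_code_from_int`` helper above is a common ``IntEnum`` pattern: prefer the named member, but tolerate unregistered codes rather than raising. A self-contained sketch of the same idea (``DemoErrorCodes`` is a trimmed, hypothetical copy of the registry above):

```python
import enum


class DemoErrorCodes(enum.IntEnum):
    NO_ERROR = 0x0
    PROTOCOL_ERROR = 0x1
    ENHANCE_YOUR_CALM = 0xb


def error_code_from_int(code):
    # Prefer the named member; fall back to the raw integer for codes
    # outside the registry instead of raising ValueError.
    try:
        return DemoErrorCodes(code)
    except ValueError:
        return code


assert error_code_from_int(0x1) is DemoErrorCodes.PROTOCOL_ERROR
assert error_code_from_int(0xFF) == 0xFF   # unknown code passed through
assert isinstance(error_code_from_int(0xb), DemoErrorCodes)
```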
diff --git a/.venv/lib/python3.9/site-packages/h2/events.py b/.venv/lib/python3.9/site-packages/h2/events.py
new file mode 100644
index 0000000..08b3186
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2/events.py
@@ -0,0 +1,634 @@
+# -*- coding: utf-8 -*-
+"""
+h2/events
+~~~~~~~~~
+
+Defines Event types for HTTP/2.
+
+Events are returned by the H2 state machine to allow implementations to keep
+track of events triggered by receiving data. Each time data is provided to the
+H2 state machine it processes the data and returns a list of Event objects.
+"""
+import binascii
+
+from .settings import ChangedSetting, _setting_code_from_int
+
+
+class Event:
+ """
+ Base class for h2 events.
+ """
+ pass
+
+
+class RequestReceived(Event):
+ """
+ The RequestReceived event is fired whenever request headers are received.
+ This event carries the HTTP headers for the given request and the stream ID
+ of the new stream.
+
+ .. versionchanged:: 2.3.0
+ Changed the type of ``headers`` to :class:`HeaderTuple
+ <hpack:hpack.HeaderTuple>`. This has no effect on current users.
+
+ .. versionchanged:: 2.4.0
+ Added ``stream_ended`` and ``priority_updated`` properties.
+ """
+ def __init__(self):
+ #: The Stream ID for the stream this request was made on.
+ self.stream_id = None
+
+ #: The request headers.
+ self.headers = None
+
+ #: If this request also ended the stream, the associated
+ #: :class:`StreamEnded <h2.events.StreamEnded>` event will be available
+ #: here.
+ #:
+ #: .. versionadded:: 2.4.0
+ self.stream_ended = None
+
+ #: If this request also had associated priority information, the
+ #: associated :class:`PriorityUpdated <h2.events.PriorityUpdated>`
+ #: event will be available here.
+ #:
+ #: .. versionadded:: 2.4.0
+ self.priority_updated = None
+
+ def __repr__(self):
+ return "" % (
+ self.stream_id, self.headers
+ )
+
+
+class ResponseReceived(Event):
+ """
+ The ResponseReceived event is fired whenever response headers are received.
+ This event carries the HTTP headers for the given response and the stream
+ ID of the new stream.
+
+ .. versionchanged:: 2.3.0
+ Changed the type of ``headers`` to :class:`HeaderTuple
+ <hpack:hpack.HeaderTuple>`. This has no effect on current users.
+
+ .. versionchanged:: 2.4.0
+ Added ``stream_ended`` and ``priority_updated`` properties.
+ """
+ def __init__(self):
+ #: The Stream ID for the stream this response was made on.
+ self.stream_id = None
+
+ #: The response headers.
+ self.headers = None
+
+ #: If this response also ended the stream, the associated
+ #: :class:`StreamEnded <h2.events.StreamEnded>` event will be available
+ #: here.
+ #:
+ #: .. versionadded:: 2.4.0
+ self.stream_ended = None
+
+ #: If this response also had associated priority information, the
+ #: associated :class:`PriorityUpdated <h2.events.PriorityUpdated>`
+ #: event will be available here.
+ #:
+ #: .. versionadded:: 2.4.0
+ self.priority_updated = None
+
+ def __repr__(self):
+ return "" % (
+ self.stream_id, self.headers
+ )
+
+
+class TrailersReceived(Event):
+ """
+ The TrailersReceived event is fired whenever trailers are received on a
+ stream. Trailers are a set of headers sent after the body of the
+ request/response, and are used to provide information that wasn't known
+ ahead of time (e.g. content-length). This event carries the HTTP header
+ fields that form the trailers and the stream ID of the stream on which they
+ were received.
+
+ .. versionchanged:: 2.3.0
+ Changed the type of ``headers`` to :class:`HeaderTuple
+ <hpack:hpack.HeaderTuple>`. This has no effect on current users.
+
+ .. versionchanged:: 2.4.0
+ Added ``stream_ended`` and ``priority_updated`` properties.
+ """
+ def __init__(self):
+ #: The Stream ID for the stream on which these trailers were received.
+ self.stream_id = None
+
+ #: The trailers themselves.
+ self.headers = None
+
+ #: Trailers always end streams. This property has the associated
+ #: :class:`StreamEnded <h2.events.StreamEnded>` in it.
+ #:
+ #: .. versionadded:: 2.4.0
+ self.stream_ended = None
+
+ #: If the trailers also set associated priority information, the
+ #: associated :class:`PriorityUpdated <h2.events.PriorityUpdated>`
+ #: event will be available here.
+ #:
+ #: .. versionadded:: 2.4.0
+ self.priority_updated = None
+
+ def __repr__(self):
+ return "" % (
+ self.stream_id, self.headers
+ )
+
+
+class _HeadersSent(Event):
+ """
+ The _HeadersSent event is fired whenever headers are sent.
+
+ This is an internal event, used to determine validation steps on
+ outgoing header blocks.
+ """
+ pass
+
+
+class _ResponseSent(_HeadersSent):
+ """
+ The _ResponseSent event is fired whenever response headers are sent
+ on a stream.
+
+ This is an internal event, used to determine validation steps on
+ outgoing header blocks.
+ """
+ pass
+
+
+class _RequestSent(_HeadersSent):
+ """
+ The _RequestSent event is fired whenever request headers are sent
+ on a stream.
+
+ This is an internal event, used to determine validation steps on
+ outgoing header blocks.
+ """
+ pass
+
+
+class _TrailersSent(_HeadersSent):
+ """
+ The _TrailersSent event is fired whenever trailers are sent on a
+ stream. Trailers are a set of headers sent after the body of the
+ request/response, and are used to provide information that wasn't known
+ ahead of time (e.g. content-length).
+
+ This is an internal event, used to determine validation steps on
+ outgoing header blocks.
+ """
+ pass
+
+
+class _PushedRequestSent(_HeadersSent):
+ """
+ The _PushedRequestSent event is fired whenever pushed request headers are
+ sent.
+
+ This is an internal event, used to determine validation steps on outgoing
+ header blocks.
+ """
+ pass
+
+
+class InformationalResponseReceived(Event):
+ """
+ The InformationalResponseReceived event is fired when an informational
+ response (that is, one whose status code is a 1XX code) is received from
+ the remote peer.
+
+ The remote peer may send any number of these, from zero upwards. These
+ responses are most commonly sent in response to requests that have the
+ ``expect: 100-continue`` header field present. Most users can safely
+ ignore this event unless they intend to use the
+ ``expect: 100-continue`` flow, or are for any reason expecting a different
+ 1XX status code.
+
+ .. versionadded:: 2.2.0
+
+ .. versionchanged:: 2.3.0
+ Changed the type of ``headers`` to :class:`HeaderTuple
+ <hpack:hpack.HeaderTuple>`. This has no effect on current users.
+
+ .. versionchanged:: 2.4.0
+ Added ``priority_updated`` property.
+ """
+ def __init__(self):
+ #: The Stream ID for the stream this informational response was made
+ #: on.
+ self.stream_id = None
+
+ #: The headers for this informational response.
+ self.headers = None
+
+ #: If this response also had associated priority information, the
+ #: associated :class:`PriorityUpdated <h2.events.PriorityUpdated>`
+ #: event will be available here.
+ #:
+ #: .. versionadded:: 2.4.0
+ self.priority_updated = None
+
+ def __repr__(self):
+ return "" % (
+ self.stream_id, self.headers
+ )
+
+
+class DataReceived(Event):
+ """
+ The DataReceived event is fired whenever data is received on a stream from
+ the remote peer. The event carries the data itself, and the stream ID on
+ which the data was received.
+
+ .. versionchanged:: 2.4.0
+ Added ``stream_ended`` property.
+ """
+ def __init__(self):
+ #: The Stream ID for the stream this data was received on.
+ self.stream_id = None
+
+ #: The data itself.
+ self.data = None
+
+ #: The amount of data received that counts against the flow control
+ #: window. Note that padding counts against the flow control window, so
+ #: when adjusting flow control you should always use this field rather
+ #: than ``len(data)``.
+ self.flow_controlled_length = None
+
+ #: If this data chunk also completed the stream, the associated
+ #: :class:`StreamEnded <h2.events.StreamEnded>` event will be available
+ #: here.
+ #:
+ #: .. versionadded:: 2.4.0
+ self.stream_ended = None
+
+ def __repr__(self):
+ return (
+ "" % (
+ self.stream_id,
+ self.flow_controlled_length,
+ _bytes_representation(self.data[:20]),
+ )
+ )
+
+
+class WindowUpdated(Event):
+ """
+ The WindowUpdated event is fired whenever a flow control window changes
+ size. HTTP/2 defines flow control windows for connections and streams: this
+ event fires for both connections and streams. The event carries the ID of
+ the stream to which it applies (set to zero if the window update applies to
+ the connection), and the delta in the window size.
+ """
+ def __init__(self):
+ #: The Stream ID of the stream whose flow control window was changed.
+ #: May be ``0`` if the connection window was changed.
+ self.stream_id = None
+
+ #: The window delta.
+ self.delta = None
+
+ def __repr__(self):
+ return "" % (
+ self.stream_id, self.delta
+ )
+
+
+class RemoteSettingsChanged(Event):
+ """
+ The RemoteSettingsChanged event is fired whenever the remote peer changes
+ its settings. It contains a complete inventory of changed settings,
+ including their previous values.
+
+ In HTTP/2, settings changes need to be acknowledged. hyper-h2 automatically
+ acknowledges settings changes for efficiency. However, it is possible that
+ the caller may not be happy with the changed setting.
+
+ When this event is received, the caller should confirm that the new
+ settings are acceptable. If they are not acceptable, the user should close
+ the connection with the error code :data:`PROTOCOL_ERROR
+ <h2.errors.ErrorCodes.PROTOCOL_ERROR>`.
+
+ .. versionchanged:: 2.0.0
+ Prior to this version the user needed to acknowledge settings changes.
+ This is no longer the case: hyper-h2 now automatically acknowledges
+ them.
+ """
+ def __init__(self):
+ #: A dictionary of setting byte to
+ #: :class:`ChangedSetting <h2.settings.ChangedSetting>`, representing
+ #: the changed settings.
+ self.changed_settings = {}
+
+ @classmethod
+ def from_settings(cls, old_settings, new_settings):
+ """
+ Build a RemoteSettingsChanged event from a set of changed settings.
+
+ :param old_settings: A complete collection of old settings, in the form
+ of a dictionary of ``{setting: value}``.
+ :param new_settings: All the changed settings and their new values, in
+ the form of a dictionary of ``{setting: value}``.
+ """
+ e = cls()
+ for setting, new_value in new_settings.items():
+ setting = _setting_code_from_int(setting)
+ original_value = old_settings.get(setting)
+ change = ChangedSetting(setting, original_value, new_value)
+ e.changed_settings[setting] = change
+
+ return e
+
+ def __repr__(self):
+ return "" % (
+ ", ".join(repr(cs) for cs in self.changed_settings.values()),
+ )
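The diffing logic in ``from_settings`` above can be sketched outside the diff with plain dicts (tuples stand in for h2's ``ChangedSetting`` objects; the names here are illustrative, not part of the library):

```python
# Standalone sketch of RemoteSettingsChanged.from_settings: diff two
# settings dicts into {setting: (old_value, new_value)} records.
def diff_settings(old_settings, new_settings):
    changed = {}
    for setting, new_value in new_settings.items():
        changed[setting] = (old_settings.get(setting), new_value)
    return changed

old = {"MAX_FRAME_SIZE": 16384, "ENABLE_PUSH": 1}
new = {"MAX_FRAME_SIZE": 32768}
print(diff_settings(old, new))  # -> {'MAX_FRAME_SIZE': (16384, 32768)}
```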
+
+
+class PingReceived(Event):
+ """
+ The PingReceived event is fired whenever a PING is received. It contains
+ the 'opaque data' of the PING frame. A ping acknowledgment with the same
+ 'opaque data' is automatically emitted after receiving a ping.
+
+ .. versionadded:: 3.1.0
+ """
+ def __init__(self):
+ #: The data included on the ping.
+ self.ping_data = None
+
+ def __repr__(self):
+ return "" % (
+ _bytes_representation(self.ping_data),
+ )
+
+
+class PingAckReceived(Event):
+ """
+ The PingAckReceived event is fired whenever a PING acknowledgment is
+ received. It contains the 'opaque data' of the PING+ACK frame, allowing the
+ user to correlate PINGs and calculate RTT.
+
+ .. versionadded:: 3.1.0
+
+ .. versionchanged:: 4.0.0
+ Removed deprecated but equivalent ``PingAcknowledged``.
+ """
+ def __init__(self):
+ #: The data included on the ping.
+ self.ping_data = None
+
+ def __repr__(self):
+ return "" % (
+ _bytes_representation(self.ping_data),
+ )
+
+
+class StreamEnded(Event):
+ """
+ The StreamEnded event is fired whenever a stream is ended by a remote
+ party. The stream may not be fully closed if it has not been closed
+ locally, but no further data or headers should be expected on that stream.
+ """
+ def __init__(self):
+ #: The Stream ID of the stream that was closed.
+ self.stream_id = None
+
+ def __repr__(self):
+ return "" % self.stream_id
+
+
+class StreamReset(Event):
+ """
+ The StreamReset event is fired in two situations. The first is when the
+ remote party forcefully resets the stream. The second is when the remote
+ party has made a protocol error which only affects a single stream. In this
+ case, Hyper-h2 will terminate the stream early and return this event.
+
+ .. versionchanged:: 2.0.0
+ This event is now fired when Hyper-h2 automatically resets a stream.
+ """
+ def __init__(self):
+ #: The Stream ID of the stream that was reset.
+ self.stream_id = None
+
+ #: The error code given. Either one of :class:`ErrorCodes
+ #: <h2.errors.ErrorCodes>` or ``int``.
+ self.error_code = None
+
+ #: Whether the remote peer sent a RST_STREAM or we did.
+ self.remote_reset = True
+
+ def __repr__(self):
+ return "" % (
+ self.stream_id, self.error_code, self.remote_reset
+ )
+
+
+class PushedStreamReceived(Event):
+ """
+ The PushedStreamReceived event is fired whenever a pushed stream has been
+ received from a remote peer. The event carries on it the new stream ID, the
+ ID of the parent stream, and the request headers pushed by the remote peer.
+ """
+ def __init__(self):
+ #: The Stream ID of the stream created by the push.
+ self.pushed_stream_id = None
+
+ #: The Stream ID of the stream that the push is related to.
+ self.parent_stream_id = None
+
+ #: The request headers, sent by the remote party in the push.
+ self.headers = None
+
+ def __repr__(self):
+ return (
+ "" % (
+ self.pushed_stream_id,
+ self.parent_stream_id,
+ self.headers,
+ )
+ )
+
+
+class SettingsAcknowledged(Event):
+ """
+ The SettingsAcknowledged event is fired whenever a settings ACK is received
+ from the remote peer. The event carries on it the settings that were
+ acknowledged, in the same format as
+ :class:`h2.events.RemoteSettingsChanged`.
+ """
+ def __init__(self):
+ #: A dictionary of setting byte to
+ #: :class:`ChangedSetting <h2.settings.ChangedSetting>`, representing
+ #: the changed settings.
+ self.changed_settings = {}
+
+ def __repr__(self):
+ return "" % (
+ ", ".join(repr(cs) for cs in self.changed_settings.values()),
+ )
+
+
+class PriorityUpdated(Event):
+ """
+ The PriorityUpdated event is fired whenever a stream sends updated priority
+ information. This can occur when the stream is opened, or at any time
+ during the stream lifetime.
+
+ This event is purely advisory, and does not need to be acted on.
+
+ .. versionadded:: 2.0.0
+ """
+ def __init__(self):
+ #: The ID of the stream whose priority information is being updated.
+ self.stream_id = None
+
+ #: The new stream weight. May be the same as the original stream
+ #: weight. An integer between 1 and 256.
+ self.weight = None
+
+ #: The stream ID this stream now depends on. May be ``0``.
+ self.depends_on = None
+
+ #: Whether the stream *exclusively* depends on the parent stream. If it
+ #: does, this stream should inherit the current children of its new
+ #: parent.
+ self.exclusive = None
+
+ def __repr__(self):
+ return (
+ "" % (
+ self.stream_id,
+ self.weight,
+ self.depends_on,
+ self.exclusive
+ )
+ )
+
+
+class ConnectionTerminated(Event):
+ """
+ The ConnectionTerminated event is fired when a connection is torn down by
+ the remote peer using a GOAWAY frame. Once received, no further action may
+ be taken on the connection: a new connection must be established.
+ """
+ def __init__(self):
+ #: The error code cited when tearing down the connection. Should be
+ #: one of :class:`ErrorCodes <h2.errors.ErrorCodes>`, but may not be if
+ #: unknown HTTP/2 extensions are being used.
+ self.error_code = None
+
+ #: The stream ID of the last stream the remote peer saw. This can
+ #: provide an indication of what data, if any, never reached the remote
+ #: peer and so can safely be resent.
+ self.last_stream_id = None
+
+ #: Additional debug data that can be appended to the GOAWAY frame.
+ self.additional_data = None
+
+ def __repr__(self):
+ return (
+ "" % (
+ self.error_code,
+ self.last_stream_id,
+ _bytes_representation(
+ self.additional_data[:20]
+ if self.additional_data else None)
+ )
+ )
+
+
+class AlternativeServiceAvailable(Event):
+ """
+ The AlternativeServiceAvailable event is fired when the remote peer
+ advertises an `RFC 7838 <https://tools.ietf.org/html/rfc7838>`_ Alternative
+ Service using an ALTSVC frame.
+
+ This event always carries the origin to which the ALTSVC information
+ applies. That origin is either supplied by the server directly, or inferred
+ by hyper-h2 from the ``:authority`` pseudo-header field that was sent by
+ the user when initiating a given stream.
+
+ This event also carries what RFC 7838 calls the "Alternative Service Field
+ Value", which is formatted like a HTTP header field and contains the
+ relevant alternative service information. Hyper-h2 does not parse or in any
+ way modify that information: the user is required to do that.
+
+ This event can only be fired on the client end of a connection.
+
+ .. versionadded:: 2.3.0
+ """
+ def __init__(self):
+ #: The origin to which the alternative service field value applies.
+ #: This field is either supplied by the server directly, or inferred by
+ #: hyper-h2 from the ``:authority`` pseudo-header field that was sent
+ #: by the user when initiating the stream on which the frame was
+ #: received.
+ self.origin = None
+
+ #: The ALTSVC field value. This contains information about the HTTP
+ #: alternative service being advertised by the server. Hyper-h2 does
+ #: not parse this field: it is left exactly as sent by the server. The
+ #: structure of the data in this field is given by `RFC 7838 Section 3
+ #: <https://tools.ietf.org/html/rfc7838#section-3>`_.
+ self.field_value = None
+
+ def __repr__(self):
+ return (
+ "" % (
+ self.origin.decode('utf-8', 'ignore'),
+ self.field_value.decode('utf-8', 'ignore'),
+ )
+ )
+
+
+class UnknownFrameReceived(Event):
+ """
+ The UnknownFrameReceived event is fired when the remote peer sends a frame
+ that hyper-h2 does not understand. This occurs primarily when the remote
+ peer is employing HTTP/2 extensions that hyper-h2 doesn't know anything
+ about.
+
+ RFC 7540 requires that HTTP/2 implementations ignore these frames. hyper-h2
+ does so. However, this event is fired to allow implementations to perform
+ special processing on those frames if needed (e.g. if the implementation
+ is capable of handling the frame itself).
+
+ .. versionadded:: 2.7.0
+ """
+ def __init__(self):
+ #: The hyperframe Frame object that encapsulates the received frame.
+ self.frame = None
+
+ def __repr__(self):
+ return ""
+
+
+def _bytes_representation(data):
+ """
+ Converts a bytestring into something that is safe to print on all Python
+ platforms.
+
+ This function is relatively expensive, so it should not be called on the
+ mainline of the code. It's safe to use in things like object repr methods
+ though.
+ """
+ if data is None:
+ return None
+
+ return binascii.hexlify(data).decode('ascii')
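As a quick standalone illustration (not part of the diff) of the helper above, hex-encoding makes arbitrary bytes safe to embed in a repr string:

```python
import binascii

def bytes_representation(data):
    # Mirrors h2's _bytes_representation: hex-encode a bytestring for
    # safe printing, passing None through unchanged.
    if data is None:
        return None
    return binascii.hexlify(data).decode('ascii')

print(bytes_representation(b"\x00\x01\xff"))  # -> 0001ff
print(bytes_representation(None))             # -> None
```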
diff --git a/.venv/lib/python3.9/site-packages/h2/exceptions.py b/.venv/lib/python3.9/site-packages/h2/exceptions.py
new file mode 100644
index 0000000..e22bebc
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2/exceptions.py
@@ -0,0 +1,187 @@
+# -*- coding: utf-8 -*-
+"""
+h2/exceptions
+~~~~~~~~~~~~~
+
+Exceptions for the HTTP/2 module.
+"""
+import h2.errors
+
+
+class H2Error(Exception):
+ """
+ The base class for all exceptions for the HTTP/2 module.
+ """
+
+
+class ProtocolError(H2Error):
+ """
+ An action was attempted in violation of the HTTP/2 protocol.
+ """
+ #: The error code corresponding to this kind of Protocol Error.
+ error_code = h2.errors.ErrorCodes.PROTOCOL_ERROR
+
+
+class FrameTooLargeError(ProtocolError):
+ """
+ The frame that we tried to send or that we received was too large.
+ """
+ #: The error code corresponding to this kind of Protocol Error.
+ error_code = h2.errors.ErrorCodes.FRAME_SIZE_ERROR
+
+
+class FrameDataMissingError(ProtocolError):
+ """
+ The frame that we received is missing some data.
+
+ .. versionadded:: 2.0.0
+ """
+ #: The error code corresponding to this kind of Protocol Error.
+ error_code = h2.errors.ErrorCodes.FRAME_SIZE_ERROR
+
+
+class TooManyStreamsError(ProtocolError):
+ """
+ An attempt was made to open a stream that would lead to too many concurrent
+ streams.
+ """
+ pass
+
+
+class FlowControlError(ProtocolError):
+ """
+ An attempted action violates flow control constraints.
+ """
+ #: The error code corresponding to this kind of Protocol Error.
+ error_code = h2.errors.ErrorCodes.FLOW_CONTROL_ERROR
+
+
+class StreamIDTooLowError(ProtocolError):
+ """
+ An attempt was made to open a stream that had an ID that is lower than the
+ highest ID we have seen on this connection.
+ """
+ def __init__(self, stream_id, max_stream_id):
+ #: The ID of the stream that we attempted to open.
+ self.stream_id = stream_id
+
+ #: The current highest-seen stream ID.
+ self.max_stream_id = max_stream_id
+
+ def __str__(self):
+ return "StreamIDTooLowError: %d is lower than %d" % (
+ self.stream_id, self.max_stream_id
+ )
+
+
+class NoAvailableStreamIDError(ProtocolError):
+ """
+ There are no available stream IDs left to the connection. All stream IDs
+ have been exhausted.
+
+ .. versionadded:: 2.0.0
+ """
+ pass
+
+
+class NoSuchStreamError(ProtocolError):
+ """
+ A stream-specific action referenced a stream that does not exist.
+
+ .. versionchanged:: 2.0.0
+ Became a subclass of :class:`ProtocolError
+ <h2.exceptions.ProtocolError>`
+ """
+ def __init__(self, stream_id):
+ #: The stream ID corresponding to the non-existent stream.
+ self.stream_id = stream_id
+
+
+class StreamClosedError(NoSuchStreamError):
+ """
+ A more specific form of
+ :class:`NoSuchStreamError `. Indicates
+ that the stream has since been closed, and that all state relating to that
+ stream has been removed.
+ """
+ def __init__(self, stream_id):
+ #: The stream ID corresponding to the nonexistent stream.
+ self.stream_id = stream_id
+
+ #: The relevant HTTP/2 error code.
+ self.error_code = h2.errors.ErrorCodes.STREAM_CLOSED
+
+ # Any events that internal code may need to fire. Not relevant to
+ # external users that may receive a StreamClosedError.
+ self._events = []
+
+
+class InvalidSettingsValueError(ProtocolError, ValueError):
+ """
+ An attempt was made to set an invalid Settings value.
+
+ .. versionadded:: 2.0.0
+ """
+ def __init__(self, msg, error_code):
+ super(InvalidSettingsValueError, self).__init__(msg)
+ self.error_code = error_code
+
+
+class InvalidBodyLengthError(ProtocolError):
+ """
+ The remote peer sent more or less data than the Content-Length header
+ indicated.
+
+ .. versionadded:: 2.0.0
+ """
+ def __init__(self, expected, actual):
+ self.expected_length = expected
+ self.actual_length = actual
+
+ def __str__(self):
+ return "InvalidBodyLengthError: Expected %d bytes, received %d" % (
+ self.expected_length, self.actual_length
+ )
+
+
+class UnsupportedFrameError(ProtocolError):
+ """
+ The remote peer sent a frame that is unsupported in this context.
+
+ .. versionadded:: 2.1.0
+
+ .. versionchanged:: 4.0.0
+ Removed deprecated KeyError parent class.
+ """
+ pass
+
+
+class RFC1122Error(H2Error):
+ """
+ Emitted when users attempt to do something that is literally allowed by the
+ relevant RFC, but is sufficiently ill-defined that it's unwise to allow
+ users to actually do it.
+
+ While there is some disagreement about whether or not we should be liberal
+ in what we accept, it is a truth universally acknowledged that we should be
+ conservative in what we emit.
+
+ .. versionadded:: 2.4.0
+ """
+ # shazow says I'm going to regret naming the exception this way. If that
+ # turns out to be true, TELL HIM NOTHING.
+ pass
+
+
+class DenialOfServiceError(ProtocolError):
+ """
+ Emitted when the remote peer exhibits a behaviour that is likely to be an
+ attempt to perform a Denial of Service attack on the implementation. This
+ is a form of ProtocolError that carries a different error code, and allows
+ easier detection of this kind of behaviour.
+
+ .. versionadded:: 2.5.0
+ """
+ #: The error code corresponding to this kind of
+ #: :class:`ProtocolError <h2.exceptions.ProtocolError>`
+ error_code = h2.errors.ErrorCodes.ENHANCE_YOUR_CALM
diff --git a/.venv/lib/python3.9/site-packages/h2/frame_buffer.py b/.venv/lib/python3.9/site-packages/h2/frame_buffer.py
new file mode 100644
index 0000000..785775e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2/frame_buffer.py
@@ -0,0 +1,160 @@
+# -*- coding: utf-8 -*-
+"""
+h2/frame_buffer
+~~~~~~~~~~~~~~~
+
+A data structure that provides a way to iterate over a byte buffer in terms of
+frames.
+"""
+from hyperframe.exceptions import InvalidFrameError, InvalidDataError
+from hyperframe.frame import (
+ Frame, HeadersFrame, ContinuationFrame, PushPromiseFrame
+)
+
+from .exceptions import (
+ ProtocolError, FrameTooLargeError, FrameDataMissingError
+)
+
+# To avoid a DOS attack based on sending loads of continuation frames, we limit
+ # the maximum number we're prepared to receive. In this case, we'll set the
+# limit to 64, which means the largest encoded header block we can receive by
+# default is 262144 bytes long, and the largest possible *at all* is 1073741760
+# bytes long.
+#
+# This value seems reasonable for now, but in future we may want to evaluate
+# making it configurable.
+CONTINUATION_BACKLOG = 64
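The figures quoted in the comment above can be checked with a couple of multiplications (a standalone sketch; the 4096-byte per-frame figure is the default the comment assumes, while 2^24 - 1 is the largest payload HTTP/2 allows at all):

```python
CONTINUATION_BACKLOG = 64

# Largest encoded header block with the assumed 4096-byte default payload.
default_block = CONTINUATION_BACKLOG * 4096
# Largest possible block at the absolute HTTP/2 frame-size maximum.
max_block = CONTINUATION_BACKLOG * (2 ** 24 - 1)

print(default_block)  # -> 262144
print(max_block)      # -> 1073741760
```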
+
+
+class FrameBuffer:
+ """
+ This is a data structure that expects to act as a buffer for HTTP/2 data
+ that allows iteration in terms of H2 frames.
+ """
+ def __init__(self, server=False):
+ self.data = b''
+ self.max_frame_size = 0
+ self._preamble = b'PRI * HTTP/2.0\r\n\r\nSM\r\n\r\n' if server else b''
+ self._preamble_len = len(self._preamble)
+ self._headers_buffer = []
+
+ def add_data(self, data):
+ """
+ Add more data to the frame buffer.
+
+ :param data: A bytestring containing the byte buffer.
+ """
+ if self._preamble_len:
+ data_len = len(data)
+ of_which_preamble = min(self._preamble_len, data_len)
+
+ if self._preamble[:of_which_preamble] != data[:of_which_preamble]:
+ raise ProtocolError("Invalid HTTP/2 preamble.")
+
+ data = data[of_which_preamble:]
+ self._preamble_len -= of_which_preamble
+ self._preamble = self._preamble[of_which_preamble:]
+
+ self.data += data
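The incremental preamble check in ``add_data`` can be sketched on its own: the client connection preface may arrive split across several reads, so only the portion that has arrived is compared and consumed (a simplified stand-alone sketch, with ``consume_preamble`` as a hypothetical helper name):

```python
# RFC 7540 Section 3.5 client connection preface.
PREAMBLE = b'PRI * HTTP/2.0\r\n\r\nSM\r\n\r\n'

def consume_preamble(remaining, data):
    # Compare only as many preamble bytes as we actually have, then
    # return (preamble still outstanding, data left over after it).
    take = min(len(remaining), len(data))
    if remaining[:take] != data[:take]:
        raise ValueError("Invalid HTTP/2 preamble.")
    return remaining[take:], data[take:]

remaining, rest = consume_preamble(PREAMBLE, b'PRI * HTTP')          # partial read
remaining, rest = consume_preamble(remaining, PREAMBLE[10:] + b'\x00\x00')
print(remaining)  # -> b'' (preamble fully consumed)
print(rest)       # -> b'\x00\x00' (frame bytes left over)
```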
+
+ def _validate_frame_length(self, length):
+ """
+ Confirm that the frame is an appropriate length.
+ """
+ if length > self.max_frame_size:
+ raise FrameTooLargeError(
+ "Received overlong frame: length %d, max %d" %
+ (length, self.max_frame_size)
+ )
+
+ def _update_header_buffer(self, f):
+ """
+ Updates the internal header buffer. Returns a frame that should replace
+ the current one. May throw exceptions if this frame is invalid.
+ """
+ # Check if we're in the middle of a headers block. If we are, this
+ # frame *must* be a CONTINUATION frame with the same stream ID as the
+ # leading HEADERS or PUSH_PROMISE frame. Anything else is a
+ # ProtocolError. If the frame *is* valid, append it to the header
+ # buffer.
+ if self._headers_buffer:
+ stream_id = self._headers_buffer[0].stream_id
+ valid_frame = (
+ f is not None and
+ isinstance(f, ContinuationFrame) and
+ f.stream_id == stream_id
+ )
+ if not valid_frame:
+ raise ProtocolError("Invalid frame during header block.")
+
+ # Append the frame to the buffer.
+ self._headers_buffer.append(f)
+ if len(self._headers_buffer) > CONTINUATION_BACKLOG:
+ raise ProtocolError("Too many continuation frames received.")
+
+ # If this is the end of the header block, then we want to build a
+ # mutant HEADERS frame that's massive. Use the original one we got,
+ # then set END_HEADERS and set its data appropriately. If it's not
+ # the end of the block, lose the current frame: we can't yield it.
+ if 'END_HEADERS' in f.flags:
+ f = self._headers_buffer[0]
+ f.flags.add('END_HEADERS')
+ f.data = b''.join(x.data for x in self._headers_buffer)
+ self._headers_buffer = []
+ else:
+ f = None
+ elif (isinstance(f, (HeadersFrame, PushPromiseFrame)) and
+ 'END_HEADERS' not in f.flags):
+ # This is the start of a headers block! Save the frame off and then
+ # act like we didn't receive one.
+ self._headers_buffer.append(f)
+ f = None
+
+ return f
+
+ # The methods below support the iterator protocol.
+ def __iter__(self):
+ return self
+
+ def __next__(self):
+ # First, check that we have enough data to successfully parse the
+ # next frame header. If not, bail. Otherwise, parse it.
+ if len(self.data) < 9:
+ raise StopIteration()
+
+ try:
+ f, length = Frame.parse_frame_header(self.data[:9])
+ except (InvalidDataError, InvalidFrameError) as e: # pragma: no cover
+ raise ProtocolError(
+ "Received frame with invalid header: %s" % str(e)
+ )
+
+ # Next, check that we have enough length to parse the frame body. If
+ # not, bail, leaving the frame header data in the buffer for next time.
+ if len(self.data) < length + 9:
+ raise StopIteration()
+
+ # Confirm the frame has an appropriate length.
+ self._validate_frame_length(length)
+
+ # Try to parse the frame body
+ try:
+ f.parse_body(memoryview(self.data[9:9+length]))
+ except InvalidDataError:
+ raise ProtocolError("Received frame with non-compliant data")
+ except InvalidFrameError:
+ raise FrameDataMissingError("Frame data missing or invalid")
+
+ # At this point, as we know we'll use or discard the entire frame, we
+ # can update the data.
+ self.data = self.data[9+length:]
+
+ # Pass the frame through the header buffer.
+ f = self._update_header_buffer(f)
+
+ # If we got a frame we didn't understand or shouldn't yield, rather
+ # than return None it'd be better if we just tried to get the next
+ # frame in the sequence instead. Recurse back into ourselves to do
+ # that. This is safe because the amount of work we have to do here is
+ # strictly bounded by the length of the buffer.
+ return f if f is not None else self.__next__()
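The 9-byte frame-header parse that ``__next__`` delegates to hyperframe can be sketched directly from RFC 7540 Section 4.1 (a simplified standalone stand-in, not the hyperframe API):

```python
import struct

def parse_frame_header(header):
    # RFC 7540 Section 4.1: 24-bit length, 8-bit type, 8-bit flags,
    # then a 31-bit stream ID with the high bit reserved.
    if len(header) != 9:
        raise ValueError("frame header must be exactly 9 bytes")
    high, low, frame_type, flags, stream_id = struct.unpack("!HBBBL", header)
    length = (high << 8) + low
    return length, frame_type, flags, stream_id & 0x7FFFFFFF

# A DATA frame header: length 5, type 0x0, flags 0x1 (END_STREAM), stream 1.
print(parse_frame_header(b"\x00\x00\x05\x00\x01\x00\x00\x00\x01"))
# -> (5, 0, 1, 1)
```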
diff --git a/.venv/lib/python3.9/site-packages/h2/settings.py b/.venv/lib/python3.9/site-packages/h2/settings.py
new file mode 100644
index 0000000..969a162
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2/settings.py
@@ -0,0 +1,334 @@
+# -*- coding: utf-8 -*-
+"""
+h2/settings
+~~~~~~~~~~~
+
+This module contains an HTTP/2 settings object. This object provides a simple
+API for manipulating HTTP/2 settings, keeping track of both the current active
+state of the settings and the unacknowledged future values of the settings.
+"""
+import collections
+from collections.abc import MutableMapping
+import enum
+
+from hyperframe.frame import SettingsFrame
+
+from h2.errors import ErrorCodes
+from h2.exceptions import InvalidSettingsValueError
+
+
+class SettingCodes(enum.IntEnum):
+ """
+ All known HTTP/2 setting codes.
+
+ .. versionadded:: 2.6.0
+ """
+
+ #: Allows the sender to inform the remote endpoint of the maximum size of
+ #: the header compression table used to decode header blocks, in octets.
+ HEADER_TABLE_SIZE = SettingsFrame.HEADER_TABLE_SIZE
+
+ #: This setting can be used to disable server push. To disable server push
+ #: on a client, set this to 0.
+ ENABLE_PUSH = SettingsFrame.ENABLE_PUSH
+
+ #: Indicates the maximum number of concurrent streams that the sender will
+ #: allow.
+ MAX_CONCURRENT_STREAMS = SettingsFrame.MAX_CONCURRENT_STREAMS
+
+ #: Indicates the sender's initial window size (in octets) for stream-level
+ #: flow control.
+ INITIAL_WINDOW_SIZE = SettingsFrame.INITIAL_WINDOW_SIZE
+
+ #: Indicates the size of the largest frame payload that the sender is
+ #: willing to receive, in octets.
+ MAX_FRAME_SIZE = SettingsFrame.MAX_FRAME_SIZE
+
+ #: This advisory setting informs a peer of the maximum size of header list
+ #: that the sender is prepared to accept, in octets. The value is based on
+ #: the uncompressed size of header fields, including the length of the name
+ #: and value in octets plus an overhead of 32 octets for each header field.
+ MAX_HEADER_LIST_SIZE = SettingsFrame.MAX_HEADER_LIST_SIZE
+
+ #: This setting can be used to enable the connect protocol. To enable on a
+ #: client set this to 1.
+ ENABLE_CONNECT_PROTOCOL = SettingsFrame.ENABLE_CONNECT_PROTOCOL
+
+
+def _setting_code_from_int(code):
+ """
+ Given an integer setting code, returns either one of :class:`SettingCodes
+ <h2.settings.SettingCodes>` or, if not present in the known set of codes,
+ returns the integer directly.
+ """
+ try:
+ return SettingCodes(code)
+ except ValueError:
+ return code
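The lookup-with-fallback idiom above is easy to demonstrate standalone (a tiny illustrative enum, not the library's full code set):

```python
import enum

class SettingCodes(enum.IntEnum):
    # A two-member subset, enough to show the idiom.
    HEADER_TABLE_SIZE = 0x01
    ENABLE_PUSH = 0x02

def setting_code_from_int(code):
    # Known codes become enum members; unknown codes pass through as ints.
    try:
        return SettingCodes(code)
    except ValueError:
        return code

print(setting_code_from_int(0x01).name)  # -> HEADER_TABLE_SIZE
print(setting_code_from_int(0xFF))       # -> 255
```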
+
+
+class ChangedSetting:
+
+ def __init__(self, setting, original_value, new_value):
+ #: The setting code given. Either one of :class:`SettingCodes
+ #: <h2.settings.SettingCodes>` or ``int``.
+ #:
+ #: .. versionchanged:: 2.6.0
+ self.setting = setting
+
+ #: The original value before being changed.
+ self.original_value = original_value
+
+ #: The new value after being changed.
+ self.new_value = new_value
+
+ def __repr__(self):
+ return (
+ "ChangedSetting(setting=%s, original_value=%s, "
+ "new_value=%s)"
+ ) % (
+ self.setting,
+ self.original_value,
+ self.new_value
+ )
+
+
+class Settings(MutableMapping):
+ """
+ An object that encapsulates HTTP/2 settings state.
+
+ HTTP/2 Settings are a complex beast. Each party, remote and local, has its
+ own settings and a view of the other party's settings. When a settings
+ frame is emitted by a peer it cannot assume that the new settings values
+ are in place until the remote peer acknowledges the setting. In principle,
+ multiple settings changes can be "in flight" at the same time, all with
+ different values.
+
+ This object encapsulates this mess. It provides a dict-like interface to
+ settings, which returns the *current* values of the settings in question.
+ Additionally, it keeps track of the stack of proposed values: each time an
+ acknowledgement is sent/received, it updates the current values with the
+ stack of proposed values. On top of all that, it validates the values to
+ make sure they're allowed, and raises :class:`InvalidSettingsValueError
+ <h2.exceptions.InvalidSettingsValueError>` if they are not.
+
+ Finally, this object understands what the default values of the HTTP/2
+ settings are, and sets those defaults appropriately.
+
+ .. versionchanged:: 2.2.0
+ Added the ``initial_values`` parameter.
+
+ .. versionchanged:: 2.5.0
+ Added the ``max_header_list_size`` property.
+
+ :param client: (optional) Whether these settings should be defaulted for a
+ client implementation or a server implementation. Defaults to ``True``.
+ :type client: ``bool``
+ :param initial_values: (optional) Any initial values the user would like
+ set, rather than RFC 7540's defaults.
+ :type initial_values: ``MutableMapping``
+ """
+ def __init__(self, client=True, initial_values=None):
+ # Backing object for the settings. This is a dictionary of
+ # (setting: [list of values]), where the first value in the list is the
+ # current value of the setting. Strictly this doesn't use lists but
+ # instead uses collections.deque to avoid repeated memory allocations.
+ #
+ # This contains the default values for HTTP/2.
+ self._settings = {
+ SettingCodes.HEADER_TABLE_SIZE: collections.deque([4096]),
+ SettingCodes.ENABLE_PUSH: collections.deque([int(client)]),
+ SettingCodes.INITIAL_WINDOW_SIZE: collections.deque([65535]),
+ SettingCodes.MAX_FRAME_SIZE: collections.deque([16384]),
+ SettingCodes.ENABLE_CONNECT_PROTOCOL: collections.deque([0]),
+ }
+ if initial_values is not None:
+ for key, value in initial_values.items():
+ invalid = _validate_setting(key, value)
+ if invalid:
+ raise InvalidSettingsValueError(
+ "Setting %d has invalid value %d" % (key, value),
+ error_code=invalid
+ )
+ self._settings[key] = collections.deque([value])
+
+ def acknowledge(self):
+ """
+ The settings have been acknowledged, either by the user (remote
+ settings) or by the remote peer (local settings).
+
+ :returns: A dict of {setting: ChangedSetting} that were applied.
+ """
+ changed_settings = {}
+
+ # If there is more than one setting in the list, we have a setting
+ # value outstanding. Update them.
+ for k, v in self._settings.items():
+ if len(v) > 1:
+ old_setting = v.popleft()
+ new_setting = v[0]
+ changed_settings[k] = ChangedSetting(
+ k, old_setting, new_setting
+ )
+
+ return changed_settings
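The deque-per-setting scheme that ``acknowledge`` relies on can be sketched independently of h2 (``PendingSettings`` is a hypothetical minimal class for illustration; index 0 is the current value, later entries are proposals that take effect only on acknowledgement):

```python
import collections

class PendingSettings:
    def __init__(self, defaults):
        self._settings = {k: collections.deque([v]) for k, v in defaults.items()}

    def propose(self, key, value):
        # Queue a new value; the current value stays at index 0.
        self._settings[key].append(value)

    def current(self, key):
        return self._settings[key][0]

    def acknowledge(self):
        # Promote the next proposed value for every setting with one pending.
        changed = {}
        for key, values in self._settings.items():
            if len(values) > 1:
                old = values.popleft()
                changed[key] = (old, values[0])
        return changed

s = PendingSettings({"MAX_FRAME_SIZE": 16384})
s.propose("MAX_FRAME_SIZE", 32768)
print(s.current("MAX_FRAME_SIZE"))  # -> 16384 (not yet acknowledged)
print(s.acknowledge())              # -> {'MAX_FRAME_SIZE': (16384, 32768)}
print(s.current("MAX_FRAME_SIZE"))  # -> 32768
```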
+
+ # Provide easy-access to well known settings.
+ @property
+ def header_table_size(self):
+ """
+ The current value of the :data:`HEADER_TABLE_SIZE
+ <h2.settings.SettingCodes.HEADER_TABLE_SIZE>` setting.
+ """
+ return self[SettingCodes.HEADER_TABLE_SIZE]
+
+ @header_table_size.setter
+ def header_table_size(self, value):
+ self[SettingCodes.HEADER_TABLE_SIZE] = value
+
+ @property
+ def enable_push(self):
+ """
+ The current value of the :data:`ENABLE_PUSH
+ <h2.settings.SettingCodes.ENABLE_PUSH>` setting.
+ """
+ return self[SettingCodes.ENABLE_PUSH]
+
+ @enable_push.setter
+ def enable_push(self, value):
+ self[SettingCodes.ENABLE_PUSH] = value
+
+ @property
+ def initial_window_size(self):
+ """
+ The current value of the :data:`INITIAL_WINDOW_SIZE
+ <h2.settings.SettingCodes.INITIAL_WINDOW_SIZE>` setting.
+ """
+ return self[SettingCodes.INITIAL_WINDOW_SIZE]
+
+ @initial_window_size.setter
+ def initial_window_size(self, value):
+ self[SettingCodes.INITIAL_WINDOW_SIZE] = value
+
+ @property
+ def max_frame_size(self):
+ """
+ The current value of the :data:`MAX_FRAME_SIZE
+ <h2.settings.SettingCodes.MAX_FRAME_SIZE>` setting.
+ """
+ return self[SettingCodes.MAX_FRAME_SIZE]
+
+ @max_frame_size.setter
+ def max_frame_size(self, value):
+ self[SettingCodes.MAX_FRAME_SIZE] = value
+
+ @property
+ def max_concurrent_streams(self):
+ """
+ The current value of the :data:`MAX_CONCURRENT_STREAMS
+ <h2.settings.SettingCodes.MAX_CONCURRENT_STREAMS>` setting.
+ """
+ return self.get(SettingCodes.MAX_CONCURRENT_STREAMS, 2**32+1)
+
+ @max_concurrent_streams.setter
+ def max_concurrent_streams(self, value):
+ self[SettingCodes.MAX_CONCURRENT_STREAMS] = value
+
+ @property
+ def max_header_list_size(self):
+ """
+ The current value of the :data:`MAX_HEADER_LIST_SIZE
+ <h2.settings.SettingCodes.MAX_HEADER_LIST_SIZE>` setting. If not set,
+ returns ``None``, which means unlimited.
+
+ .. versionadded:: 2.5.0
+ """
+ return self.get(SettingCodes.MAX_HEADER_LIST_SIZE, None)
+
+ @max_header_list_size.setter
+ def max_header_list_size(self, value):
+ self[SettingCodes.MAX_HEADER_LIST_SIZE] = value
+
+ @property
+ def enable_connect_protocol(self):
+ """
+ The current value of the :data:`ENABLE_CONNECT_PROTOCOL
+ <h2.settings.SettingCodes.ENABLE_CONNECT_PROTOCOL>` setting.
+ """
+ return self[SettingCodes.ENABLE_CONNECT_PROTOCOL]
+
+ @enable_connect_protocol.setter
+ def enable_connect_protocol(self, value):
+ self[SettingCodes.ENABLE_CONNECT_PROTOCOL] = value
+
+ # Implement the MutableMapping API.
+ def __getitem__(self, key):
+ val = self._settings[key][0]
+
+ # Things that were created when a setting was received should stay
+ # KeyError'd.
+ if val is None:
+ raise KeyError
+
+ return val
+
+ def __setitem__(self, key, value):
+ invalid = _validate_setting(key, value)
+ if invalid:
+ raise InvalidSettingsValueError(
+ "Setting %d has invalid value %d" % (key, value),
+ error_code=invalid
+ )
+
+ try:
+ items = self._settings[key]
+ except KeyError:
+ items = collections.deque([None])
+ self._settings[key] = items
+
+ items.append(value)
+
+ def __delitem__(self, key):
+ del self._settings[key]
+
+ def __iter__(self):
+ return self._settings.__iter__()
+
+ def __len__(self):
+ return len(self._settings)
+
+ def __eq__(self, other):
+ if isinstance(other, Settings):
+ return self._settings == other._settings
+ else:
+ return NotImplemented
+
+ def __ne__(self, other):
+ if isinstance(other, Settings):
+ return not self == other
+ else:
+ return NotImplemented
+
+
+def _validate_setting(setting, value): # noqa: C901
+ """
+ Confirms that a specific setting has a well-formed value. If the setting is
+ invalid, returns an error code. Otherwise, returns 0 (NO_ERROR).
+ """
+ if setting == SettingCodes.ENABLE_PUSH:
+ if value not in (0, 1):
+ return ErrorCodes.PROTOCOL_ERROR
+ elif setting == SettingCodes.INITIAL_WINDOW_SIZE:
+ if not 0 <= value <= 2147483647: # 2^31 - 1
+ return ErrorCodes.FLOW_CONTROL_ERROR
+ elif setting == SettingCodes.MAX_FRAME_SIZE:
+ if not 16384 <= value <= 16777215: # 2^14 and 2^24 - 1
+ return ErrorCodes.PROTOCOL_ERROR
+ elif setting == SettingCodes.MAX_HEADER_LIST_SIZE:
+ if value < 0:
+ return ErrorCodes.PROTOCOL_ERROR
+ elif setting == SettingCodes.ENABLE_CONNECT_PROTOCOL:
+ if value not in (0, 1):
+ return ErrorCodes.PROTOCOL_ERROR
+
+ return 0
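The bounds that ``_validate_setting`` enforces come straight from RFC 7540 Section 6.5.2 and can be sketched standalone (plain strings stand in for h2's ``ErrorCodes``):

```python
def validate_setting(setting, value):
    # Return 0 for a valid value, or the name of the HTTP/2 error code
    # the spec prescribes for an invalid one.
    if setting == "ENABLE_PUSH" and value not in (0, 1):
        return "PROTOCOL_ERROR"
    if setting == "INITIAL_WINDOW_SIZE" and not 0 <= value <= 2**31 - 1:
        return "FLOW_CONTROL_ERROR"
    if setting == "MAX_FRAME_SIZE" and not 2**14 <= value <= 2**24 - 1:
        return "PROTOCOL_ERROR"
    return 0

print(validate_setting("MAX_FRAME_SIZE", 16384))       # -> 0 (valid)
print(validate_setting("MAX_FRAME_SIZE", 1000))        # -> PROTOCOL_ERROR
print(validate_setting("INITIAL_WINDOW_SIZE", 2**31))  # -> FLOW_CONTROL_ERROR
```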
diff --git a/.venv/lib/python3.9/site-packages/h2/stream.py b/.venv/lib/python3.9/site-packages/h2/stream.py
new file mode 100644
index 0000000..3c29b24
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2/stream.py
@@ -0,0 +1,1371 @@
+# -*- coding: utf-8 -*-
+"""
+h2/stream
+~~~~~~~~~
+
+An implementation of an HTTP/2 stream.
+"""
+from enum import Enum, IntEnum
+from hpack import HeaderTuple
+from hyperframe.frame import (
+ HeadersFrame, ContinuationFrame, DataFrame, WindowUpdateFrame,
+ RstStreamFrame, PushPromiseFrame, AltSvcFrame
+)
+
+from .errors import ErrorCodes, _error_code_from_int
+from .events import (
+ RequestReceived, ResponseReceived, DataReceived, WindowUpdated,
+ StreamEnded, PushedStreamReceived, StreamReset, TrailersReceived,
+ InformationalResponseReceived, AlternativeServiceAvailable,
+ _ResponseSent, _RequestSent, _TrailersSent, _PushedRequestSent
+)
+from .exceptions import (
+ ProtocolError, StreamClosedError, InvalidBodyLengthError, FlowControlError
+)
+from .utilities import (
+ guard_increment_window, is_informational_response, authority_from_headers,
+ validate_headers, validate_outbound_headers, normalize_outbound_headers,
+ HeaderValidationFlags, extract_method_header, normalize_inbound_headers
+)
+from .windows import WindowManager
+
+
+class StreamState(IntEnum):
+ IDLE = 0
+ RESERVED_REMOTE = 1
+ RESERVED_LOCAL = 2
+ OPEN = 3
+ HALF_CLOSED_REMOTE = 4
+ HALF_CLOSED_LOCAL = 5
+ CLOSED = 6
+
+
+class StreamInputs(Enum):
+ SEND_HEADERS = 0
+ SEND_PUSH_PROMISE = 1
+ SEND_RST_STREAM = 2
+ SEND_DATA = 3
+ SEND_WINDOW_UPDATE = 4
+ SEND_END_STREAM = 5
+ RECV_HEADERS = 6
+ RECV_PUSH_PROMISE = 7
+ RECV_RST_STREAM = 8
+ RECV_DATA = 9
+ RECV_WINDOW_UPDATE = 10
+ RECV_END_STREAM = 11
+ RECV_CONTINUATION = 12 # Added in 2.0.0
+ SEND_INFORMATIONAL_HEADERS = 13 # Added in 2.2.0
+ RECV_INFORMATIONAL_HEADERS = 14 # Added in 2.2.0
+ SEND_ALTERNATIVE_SERVICE = 15 # Added in 2.3.0
+ RECV_ALTERNATIVE_SERVICE = 16 # Added in 2.3.0
+ UPGRADE_CLIENT = 17 # Added 2.3.0
+ UPGRADE_SERVER = 18 # Added 2.3.0
+
+
+class StreamClosedBy(Enum):
+ SEND_END_STREAM = 0
+ RECV_END_STREAM = 1
+ SEND_RST_STREAM = 2
+ RECV_RST_STREAM = 3
+
+
+# This array is initialized once, and is indexed by the stream states above.
+# It indicates whether a stream in the given state is open. We do this
+# because we check whether a stream in a given state is open very
+# frequently: given how often we check, the lookup should be as fast as
+# possible.
+STREAM_OPEN = [False for _ in range(0, len(StreamState))]
+STREAM_OPEN[StreamState.OPEN] = True
+STREAM_OPEN[StreamState.HALF_CLOSED_LOCAL] = True
+STREAM_OPEN[StreamState.HALF_CLOSED_REMOTE] = True
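The lookup-table trick above — indexing a plain list by an `IntEnum` member instead of testing set membership — can be sketched in isolation. The names below are invented for illustration and are not part of h2:

```python
from enum import IntEnum

class State(IntEnum):
    IDLE = 0
    OPEN = 1
    CLOSED = 2

# Build the table once: a list indexed by the enum's integer value.
IS_OPEN = [False] * len(State)
IS_OPEN[State.OPEN] = True

def is_open(state):
    # One list index per call, rather than `state in {State.OPEN, ...}`.
    return IS_OPEN[state]
```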
+
+
+class H2StreamStateMachine:
+ """
+ A single HTTP/2 stream state machine.
+
+ This stream object essentially implements the state machine described in
+ RFC 7540 Section 5.1.
+
+ :param stream_id: The stream ID of this stream. This is stored primarily
+ for logging purposes.
+ """
+ def __init__(self, stream_id):
+ self.state = StreamState.IDLE
+ self.stream_id = stream_id
+
+ #: Whether this peer is the client side of this stream.
+ self.client = None
+
+ # Whether trailers have been sent/received on this stream or not.
+ self.headers_sent = None
+ self.trailers_sent = None
+ self.headers_received = None
+ self.trailers_received = None
+
+ # How the stream was closed. One of StreamClosedBy.
+ self.stream_closed_by = None
+
+ def process_input(self, input_):
+ """
+ Process a specific input in the state machine.
+ """
+ if not isinstance(input_, StreamInputs):
+ raise ValueError("Input must be an instance of StreamInputs")
+
+ try:
+ func, target_state = _transitions[(self.state, input_)]
+ except KeyError:
+ old_state = self.state
+ self.state = StreamState.CLOSED
+ raise ProtocolError(
+ "Invalid input %s in state %s" % (input_, old_state)
+ )
+ else:
+ previous_state = self.state
+ self.state = target_state
+ if func is not None:
+ try:
+ return func(self, previous_state)
+ except ProtocolError:
+ self.state = StreamState.CLOSED
+ raise
+ except AssertionError as e: # pragma: no cover
+ self.state = StreamState.CLOSED
+ raise ProtocolError(e)
+
+ return []
+
+ def request_sent(self, previous_state):
+ """
+ Fires when a request is sent.
+ """
+ self.client = True
+ self.headers_sent = True
+ event = _RequestSent()
+
+ return [event]
+
+ def response_sent(self, previous_state):
+ """
+ Fires when something that should be a response is sent. This 'response'
+ may actually be trailers.
+ """
+ if not self.headers_sent:
+ if self.client is True or self.client is None:
+ raise ProtocolError("Client cannot send responses.")
+ self.headers_sent = True
+ event = _ResponseSent()
+ else:
+ assert not self.trailers_sent
+ self.trailers_sent = True
+ event = _TrailersSent()
+
+ return [event]
+
+ def request_received(self, previous_state):
+ """
+ Fires when a request is received.
+ """
+ assert not self.headers_received
+ assert not self.trailers_received
+
+ self.client = False
+ self.headers_received = True
+ event = RequestReceived()
+
+ event.stream_id = self.stream_id
+ return [event]
+
+ def response_received(self, previous_state):
+ """
+ Fires when a response is received. Also disambiguates between responses
+ and trailers.
+ """
+ if not self.headers_received:
+ assert self.client is True
+ self.headers_received = True
+ event = ResponseReceived()
+ else:
+ assert not self.trailers_received
+ self.trailers_received = True
+ event = TrailersReceived()
+
+ event.stream_id = self.stream_id
+ return [event]
+
+ def data_received(self, previous_state):
+ """
+ Fires when data is received.
+ """
+ if not self.headers_received:
+ raise ProtocolError("cannot receive data before headers")
+ event = DataReceived()
+ event.stream_id = self.stream_id
+ return [event]
+
+ def window_updated(self, previous_state):
+ """
+ Fires when a window update frame is received.
+ """
+ event = WindowUpdated()
+ event.stream_id = self.stream_id
+ return [event]
+
+ def stream_half_closed(self, previous_state):
+ """
+ Fires when an END_STREAM flag is received in the OPEN state,
+ transitioning this stream to a HALF_CLOSED_REMOTE state.
+ """
+ event = StreamEnded()
+ event.stream_id = self.stream_id
+ return [event]
+
+ def stream_ended(self, previous_state):
+ """
+ Fires when a stream is cleanly ended.
+ """
+ self.stream_closed_by = StreamClosedBy.RECV_END_STREAM
+ event = StreamEnded()
+ event.stream_id = self.stream_id
+ return [event]
+
+ def stream_reset(self, previous_state):
+ """
+ Fired when a stream is forcefully reset.
+ """
+ self.stream_closed_by = StreamClosedBy.RECV_RST_STREAM
+ event = StreamReset()
+ event.stream_id = self.stream_id
+ return [event]
+
+ def send_new_pushed_stream(self, previous_state):
+ """
+ Fires on the newly pushed stream, when pushed by the local peer.
+
+ No event here, but definitionally this peer must be a server.
+ """
+ assert self.client is None
+ self.client = False
+ self.headers_received = True
+ return []
+
+ def recv_new_pushed_stream(self, previous_state):
+ """
+ Fires on the newly pushed stream, when pushed by the remote peer.
+
+ No event here, but definitionally this peer must be a client.
+ """
+ assert self.client is None
+ self.client = True
+ self.headers_sent = True
+ return []
+
+ def send_push_promise(self, previous_state):
+ """
+ Fires on the already-existing stream when a PUSH_PROMISE frame is sent.
+ We may only send PUSH_PROMISE frames if we're a server.
+ """
+ if self.client is True:
+ raise ProtocolError("Cannot push streams from client peers.")
+
+ event = _PushedRequestSent()
+ return [event]
+
+ def recv_push_promise(self, previous_state):
+ """
+ Fires on the already-existing stream when a PUSH_PROMISE frame is
+ received. We may only receive PUSH_PROMISE frames if we're a client.
+
+ Fires a PushedStreamReceived event.
+ """
+ if not self.client:
+ if self.client is None: # pragma: no cover
+ msg = "Idle streams cannot receive pushes"
+ else: # pragma: no cover
+ msg = "Cannot receive pushed streams as a server"
+ raise ProtocolError(msg)
+
+ event = PushedStreamReceived()
+ event.parent_stream_id = self.stream_id
+ return [event]
+
+ def send_end_stream(self, previous_state):
+ """
+ Called when an attempt is made to send END_STREAM in the
+ HALF_CLOSED_REMOTE state.
+ """
+ self.stream_closed_by = StreamClosedBy.SEND_END_STREAM
+
+ def send_reset_stream(self, previous_state):
+ """
+ Called when an attempt is made to send RST_STREAM in a non-closed
+ stream state.
+ """
+ self.stream_closed_by = StreamClosedBy.SEND_RST_STREAM
+
+ def reset_stream_on_error(self, previous_state):
+ """
+ Called when we need to forcefully emit another RST_STREAM frame on
+ behalf of the state machine.
+
+ If this is the first time we've done this, we should also hang an event
+ off the StreamClosedError so that the user can be informed. We know
+ it's the first time we've done this if the stream is currently in a
+ state other than CLOSED.
+ """
+ self.stream_closed_by = StreamClosedBy.SEND_RST_STREAM
+
+ error = StreamClosedError(self.stream_id)
+
+ event = StreamReset()
+ event.stream_id = self.stream_id
+ event.error_code = ErrorCodes.STREAM_CLOSED
+ event.remote_reset = False
+ error._events = [event]
+ raise error
+
+ def recv_on_closed_stream(self, previous_state):
+ """
+ Called when an unexpected frame is received on an already-closed
+ stream.
+
+ An endpoint that receives an unexpected frame should treat it as
+ a stream error or connection error with type STREAM_CLOSED, depending
+ on the specific frame. The error handling is done at a higher level:
+ this just raises the appropriate error.
+ """
+ raise StreamClosedError(self.stream_id)
+
+ def send_on_closed_stream(self, previous_state):
+ """
+ Called when an attempt is made to send data on an already-closed
+ stream.
+
+ This essentially overrides the standard logic by throwing a
+ more-specific error: StreamClosedError. This is a ProtocolError, so it
+ matches the standard API of the state machine, but provides more detail
+ to the user.
+ """
+ raise StreamClosedError(self.stream_id)
+
+ def recv_push_on_closed_stream(self, previous_state):
+ """
+ Called when a PUSH_PROMISE frame is received on an already-closed
+ stream.
+
+ If the stream was closed by us sending a RST_STREAM frame, then we
+ presume that the PUSH_PROMISE was in flight when we reset the parent
+ stream. Rather than accept the new stream, we just reset it.
+ Otherwise, we should call this a PROTOCOL_ERROR: pushing a stream on a
+ naturally closed stream is a real problem because it creates a brand
+ new stream that the remote peer now believes exists.
+ """
+ assert self.stream_closed_by is not None
+
+ if self.stream_closed_by == StreamClosedBy.SEND_RST_STREAM:
+ raise StreamClosedError(self.stream_id)
+ else:
+ raise ProtocolError("Attempted to push on closed stream.")
+
+ def send_push_on_closed_stream(self, previous_state):
+ """
+ Called when an attempt is made to push on an already-closed stream.
+
+ This essentially overrides the standard logic by providing a more
+ useful error message. It's necessary because simply indicating that the
+ stream is closed is not enough: there is now a new stream that is not
+ allowed to be there. The only recourse is to tear the whole connection
+ down.
+ """
+ raise ProtocolError("Attempted to push on closed stream.")
+
+ def send_informational_response(self, previous_state):
+ """
+ Called when an informational header block is sent (that is, a block
+ where the :status header has a 1XX value).
+
+ Only enforces that these are sent *before* final headers are sent.
+ """
+ if self.headers_sent:
+ raise ProtocolError("Informational response after final response")
+
+ event = _ResponseSent()
+ return [event]
+
+ def recv_informational_response(self, previous_state):
+ """
+ Called when an informational header block is received (that is, a block
+ where the :status header has a 1XX value).
+ """
+ if self.headers_received:
+ raise ProtocolError("Informational response after final response")
+
+ event = InformationalResponseReceived()
+ event.stream_id = self.stream_id
+ return [event]
+
+ def recv_alt_svc(self, previous_state):
+ """
+ Called when receiving an ALTSVC frame.
+
+ RFC 7838 allows us to receive ALTSVC frames at any stream state, which
+ is really absurdly overzealous. For that reason, we want to limit the
+ states in which we can actually receive it. It's really only sensible
+ to receive it after we've sent our own headers and before the server
+ has sent its header block: the server can't guarantee that we have any
+ state around after it completes its header block, and the server
+ doesn't know what origin we're talking about before we've sent ours.
+
+ For that reason, this function applies a few extra checks on both state
+ and some of the little state variables we keep around. If those suggest
+ an unreasonable situation for the ALTSVC frame to have been sent in,
+ we quietly ignore it (as RFC 7838 suggests).
+
+ This function is also *not* always called by the state machine. In some
+ states (IDLE, RESERVED_LOCAL, CLOSED) we don't bother to call it,
+ because we know the frame cannot be valid in that state (IDLE because
+ the server cannot know what origin the stream applies to, CLOSED
+ because the server cannot assume we still have state around,
+ RESERVED_LOCAL because by definition if we're in the RESERVED_LOCAL
+ state then *we* are the server).
+ """
+ # Servers can't receive ALTSVC frames, but RFC 7838 tells us to ignore
+ # them.
+ if self.client is False:
+ return []
+
+ # If we've received the response headers from the server they can't
+ # guarantee we still have any state around. Other implementations
+ # (like nghttp2) ignore ALTSVC in this state, so we will too.
+ if self.headers_received:
+ return []
+
+ # Otherwise, this is a sensible enough frame to have received. Return
+ # the event and let it get populated.
+ return [AlternativeServiceAvailable()]
+
+ def send_alt_svc(self, previous_state):
+ """
+ Called when sending an ALTSVC frame on this stream.
+
+ For consistency with the restrictions we apply on receiving ALTSVC
+ frames in ``recv_alt_svc``, we want to restrict when users can send
+ ALTSVC frames to the situations when we ourselves would accept them.
+
+ That means: when we are a server, when we have received the request
+ headers, and when we have not yet sent our own response headers.
+ """
+ # We should not send ALTSVC after we've sent response headers, as the
+ # client may have disposed of its state.
+ if self.headers_sent:
+ raise ProtocolError(
+ "Cannot send ALTSVC after sending response headers."
+ )
+
+ return
+
+
+# STATE MACHINE
+#
+# The stream state machine is defined here to avoid the need to allocate it
+# repeatedly for each stream. It cannot be defined in the stream class because
+# it needs to be able to reference the callbacks defined on the class, but
+# because Python's scoping rules are weird the class object is not actually in
+# scope during the body of the class object.
+#
+# For the sake of clarity, we reproduce the RFC 7540 state machine here:
+#
+# +--------+
+# send PP | | recv PP
+# ,--------| idle |--------.
+# / | | \
+# v +--------+ v
+# +----------+ | +----------+
+# | | | send H / | |
+# ,------| reserved | | recv H | reserved |------.
+# | | (local) | | | (remote) | |
+# | +----------+ v +----------+ |
+# | | +--------+ | |
+# | | recv ES | | send ES | |
+# | send H | ,-------| open |-------. | recv H |
+# | | / | | \ | |
+# | v v +--------+ v v |
+# | +----------+ | +----------+ |
+# | | half | | | half | |
+# | | closed | | send R / | closed | |
+# | | (remote) | | recv R | (local) | |
+# | +----------+ | +----------+ |
+# | | | | |
+# | | send ES / | recv ES / | |
+# | | send R / v send R / | |
+# | | recv R +--------+ recv R | |
+# | send R / `----------->| |<-----------' send R / |
+# | recv R | closed | recv R |
+# `----------------------->| |<----------------------'
+# +--------+
+#
+# send: endpoint sends this frame
+# recv: endpoint receives this frame
+#
+# H: HEADERS frame (with implied CONTINUATIONs)
+# PP: PUSH_PROMISE frame (with implied CONTINUATIONs)
+# ES: END_STREAM flag
+# R: RST_STREAM frame
+#
+# For the purposes of this state machine we treat HEADERS and their
+# associated CONTINUATION frames as a single jumbo frame. The protocol
+# allows/requires this by preventing other frames from being interleaved in
+# between HEADERS/CONTINUATION frames. However, if a CONTINUATION frame is
+# received without a prior HEADERS frame, it *will* be passed to this state
+# machine. The state machine should always reject that frame, either as an
+# invalid transition or because the stream is closed.
+#
+# There is a confusing relationship around PUSH_PROMISE frames. The state
+# machine above considers them to be frames belonging to the new stream,
+# which is *somewhat* true. However, they are sent with the stream ID of
+# their related stream, and are only sendable in some cases.
+# For this reason, our state machine implementation below allows for
+# PUSH_PROMISE frames not only in the IDLE state (as in the diagram), but
+# also in the OPEN, HALF_CLOSED_LOCAL, and HALF_CLOSED_REMOTE states.
+# Essentially, for hyper-h2, PUSH_PROMISE frames are effectively sent on
+# two streams.
+#
+# The _transitions dictionary contains a mapping of tuples of
+# (state, input) to tuples of (side_effect_function, end_state). This
+# map contains all allowed transitions: anything not in this map is
+# invalid and immediately causes a transition to ``closed``.
+_transitions = {
+ # State: idle
+ (StreamState.IDLE, StreamInputs.SEND_HEADERS):
+ (H2StreamStateMachine.request_sent, StreamState.OPEN),
+ (StreamState.IDLE, StreamInputs.RECV_HEADERS):
+ (H2StreamStateMachine.request_received, StreamState.OPEN),
+ (StreamState.IDLE, StreamInputs.RECV_DATA):
+ (H2StreamStateMachine.reset_stream_on_error, StreamState.CLOSED),
+ (StreamState.IDLE, StreamInputs.SEND_PUSH_PROMISE):
+ (H2StreamStateMachine.send_new_pushed_stream,
+ StreamState.RESERVED_LOCAL),
+ (StreamState.IDLE, StreamInputs.RECV_PUSH_PROMISE):
+ (H2StreamStateMachine.recv_new_pushed_stream,
+ StreamState.RESERVED_REMOTE),
+ (StreamState.IDLE, StreamInputs.RECV_ALTERNATIVE_SERVICE):
+ (None, StreamState.IDLE),
+ (StreamState.IDLE, StreamInputs.UPGRADE_CLIENT):
+ (H2StreamStateMachine.request_sent, StreamState.HALF_CLOSED_LOCAL),
+ (StreamState.IDLE, StreamInputs.UPGRADE_SERVER):
+ (H2StreamStateMachine.request_received,
+ StreamState.HALF_CLOSED_REMOTE),
+
+ # State: reserved local
+ (StreamState.RESERVED_LOCAL, StreamInputs.SEND_HEADERS):
+ (H2StreamStateMachine.response_sent, StreamState.HALF_CLOSED_REMOTE),
+ (StreamState.RESERVED_LOCAL, StreamInputs.RECV_DATA):
+ (H2StreamStateMachine.reset_stream_on_error, StreamState.CLOSED),
+ (StreamState.RESERVED_LOCAL, StreamInputs.SEND_WINDOW_UPDATE):
+ (None, StreamState.RESERVED_LOCAL),
+ (StreamState.RESERVED_LOCAL, StreamInputs.RECV_WINDOW_UPDATE):
+ (H2StreamStateMachine.window_updated, StreamState.RESERVED_LOCAL),
+ (StreamState.RESERVED_LOCAL, StreamInputs.SEND_RST_STREAM):
+ (H2StreamStateMachine.send_reset_stream, StreamState.CLOSED),
+ (StreamState.RESERVED_LOCAL, StreamInputs.RECV_RST_STREAM):
+ (H2StreamStateMachine.stream_reset, StreamState.CLOSED),
+ (StreamState.RESERVED_LOCAL, StreamInputs.SEND_ALTERNATIVE_SERVICE):
+ (H2StreamStateMachine.send_alt_svc, StreamState.RESERVED_LOCAL),
+ (StreamState.RESERVED_LOCAL, StreamInputs.RECV_ALTERNATIVE_SERVICE):
+ (None, StreamState.RESERVED_LOCAL),
+
+ # State: reserved remote
+ (StreamState.RESERVED_REMOTE, StreamInputs.RECV_HEADERS):
+ (H2StreamStateMachine.response_received,
+ StreamState.HALF_CLOSED_LOCAL),
+ (StreamState.RESERVED_REMOTE, StreamInputs.RECV_DATA):
+ (H2StreamStateMachine.reset_stream_on_error, StreamState.CLOSED),
+ (StreamState.RESERVED_REMOTE, StreamInputs.SEND_WINDOW_UPDATE):
+ (None, StreamState.RESERVED_REMOTE),
+ (StreamState.RESERVED_REMOTE, StreamInputs.RECV_WINDOW_UPDATE):
+ (H2StreamStateMachine.window_updated, StreamState.RESERVED_REMOTE),
+ (StreamState.RESERVED_REMOTE, StreamInputs.SEND_RST_STREAM):
+ (H2StreamStateMachine.send_reset_stream, StreamState.CLOSED),
+ (StreamState.RESERVED_REMOTE, StreamInputs.RECV_RST_STREAM):
+ (H2StreamStateMachine.stream_reset, StreamState.CLOSED),
+ (StreamState.RESERVED_REMOTE, StreamInputs.RECV_ALTERNATIVE_SERVICE):
+ (H2StreamStateMachine.recv_alt_svc, StreamState.RESERVED_REMOTE),
+
+ # State: open
+ (StreamState.OPEN, StreamInputs.SEND_HEADERS):
+ (H2StreamStateMachine.response_sent, StreamState.OPEN),
+ (StreamState.OPEN, StreamInputs.RECV_HEADERS):
+ (H2StreamStateMachine.response_received, StreamState.OPEN),
+ (StreamState.OPEN, StreamInputs.SEND_DATA):
+ (None, StreamState.OPEN),
+ (StreamState.OPEN, StreamInputs.RECV_DATA):
+ (H2StreamStateMachine.data_received, StreamState.OPEN),
+ (StreamState.OPEN, StreamInputs.SEND_END_STREAM):
+ (None, StreamState.HALF_CLOSED_LOCAL),
+ (StreamState.OPEN, StreamInputs.RECV_END_STREAM):
+ (H2StreamStateMachine.stream_half_closed,
+ StreamState.HALF_CLOSED_REMOTE),
+ (StreamState.OPEN, StreamInputs.SEND_WINDOW_UPDATE):
+ (None, StreamState.OPEN),
+ (StreamState.OPEN, StreamInputs.RECV_WINDOW_UPDATE):
+ (H2StreamStateMachine.window_updated, StreamState.OPEN),
+ (StreamState.OPEN, StreamInputs.SEND_RST_STREAM):
+ (H2StreamStateMachine.send_reset_stream, StreamState.CLOSED),
+ (StreamState.OPEN, StreamInputs.RECV_RST_STREAM):
+ (H2StreamStateMachine.stream_reset, StreamState.CLOSED),
+ (StreamState.OPEN, StreamInputs.SEND_PUSH_PROMISE):
+ (H2StreamStateMachine.send_push_promise, StreamState.OPEN),
+ (StreamState.OPEN, StreamInputs.RECV_PUSH_PROMISE):
+ (H2StreamStateMachine.recv_push_promise, StreamState.OPEN),
+ (StreamState.OPEN, StreamInputs.SEND_INFORMATIONAL_HEADERS):
+ (H2StreamStateMachine.send_informational_response, StreamState.OPEN),
+ (StreamState.OPEN, StreamInputs.RECV_INFORMATIONAL_HEADERS):
+ (H2StreamStateMachine.recv_informational_response, StreamState.OPEN),
+ (StreamState.OPEN, StreamInputs.SEND_ALTERNATIVE_SERVICE):
+ (H2StreamStateMachine.send_alt_svc, StreamState.OPEN),
+ (StreamState.OPEN, StreamInputs.RECV_ALTERNATIVE_SERVICE):
+ (H2StreamStateMachine.recv_alt_svc, StreamState.OPEN),
+
+ # State: half-closed remote
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.SEND_HEADERS):
+ (H2StreamStateMachine.response_sent, StreamState.HALF_CLOSED_REMOTE),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.RECV_HEADERS):
+ (H2StreamStateMachine.reset_stream_on_error, StreamState.CLOSED),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.SEND_DATA):
+ (None, StreamState.HALF_CLOSED_REMOTE),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.RECV_DATA):
+ (H2StreamStateMachine.reset_stream_on_error, StreamState.CLOSED),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.SEND_END_STREAM):
+ (H2StreamStateMachine.send_end_stream, StreamState.CLOSED),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.SEND_WINDOW_UPDATE):
+ (None, StreamState.HALF_CLOSED_REMOTE),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.RECV_WINDOW_UPDATE):
+ (H2StreamStateMachine.window_updated, StreamState.HALF_CLOSED_REMOTE),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.SEND_RST_STREAM):
+ (H2StreamStateMachine.send_reset_stream, StreamState.CLOSED),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.RECV_RST_STREAM):
+ (H2StreamStateMachine.stream_reset, StreamState.CLOSED),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.SEND_PUSH_PROMISE):
+ (H2StreamStateMachine.send_push_promise,
+ StreamState.HALF_CLOSED_REMOTE),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.RECV_PUSH_PROMISE):
+ (H2StreamStateMachine.reset_stream_on_error, StreamState.CLOSED),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.SEND_INFORMATIONAL_HEADERS):
+ (H2StreamStateMachine.send_informational_response,
+ StreamState.HALF_CLOSED_REMOTE),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.SEND_ALTERNATIVE_SERVICE):
+ (H2StreamStateMachine.send_alt_svc, StreamState.HALF_CLOSED_REMOTE),
+ (StreamState.HALF_CLOSED_REMOTE, StreamInputs.RECV_ALTERNATIVE_SERVICE):
+ (H2StreamStateMachine.recv_alt_svc, StreamState.HALF_CLOSED_REMOTE),
+
+ # State: half-closed local
+ (StreamState.HALF_CLOSED_LOCAL, StreamInputs.RECV_HEADERS):
+ (H2StreamStateMachine.response_received,
+ StreamState.HALF_CLOSED_LOCAL),
+ (StreamState.HALF_CLOSED_LOCAL, StreamInputs.RECV_DATA):
+ (H2StreamStateMachine.data_received, StreamState.HALF_CLOSED_LOCAL),
+ (StreamState.HALF_CLOSED_LOCAL, StreamInputs.RECV_END_STREAM):
+ (H2StreamStateMachine.stream_ended, StreamState.CLOSED),
+ (StreamState.HALF_CLOSED_LOCAL, StreamInputs.SEND_WINDOW_UPDATE):
+ (None, StreamState.HALF_CLOSED_LOCAL),
+ (StreamState.HALF_CLOSED_LOCAL, StreamInputs.RECV_WINDOW_UPDATE):
+ (H2StreamStateMachine.window_updated, StreamState.HALF_CLOSED_LOCAL),
+ (StreamState.HALF_CLOSED_LOCAL, StreamInputs.SEND_RST_STREAM):
+ (H2StreamStateMachine.send_reset_stream, StreamState.CLOSED),
+ (StreamState.HALF_CLOSED_LOCAL, StreamInputs.RECV_RST_STREAM):
+ (H2StreamStateMachine.stream_reset, StreamState.CLOSED),
+ (StreamState.HALF_CLOSED_LOCAL, StreamInputs.RECV_PUSH_PROMISE):
+ (H2StreamStateMachine.recv_push_promise,
+ StreamState.HALF_CLOSED_LOCAL),
+ (StreamState.HALF_CLOSED_LOCAL, StreamInputs.RECV_INFORMATIONAL_HEADERS):
+ (H2StreamStateMachine.recv_informational_response,
+ StreamState.HALF_CLOSED_LOCAL),
+ (StreamState.HALF_CLOSED_LOCAL, StreamInputs.SEND_ALTERNATIVE_SERVICE):
+ (H2StreamStateMachine.send_alt_svc, StreamState.HALF_CLOSED_LOCAL),
+ (StreamState.HALF_CLOSED_LOCAL, StreamInputs.RECV_ALTERNATIVE_SERVICE):
+ (H2StreamStateMachine.recv_alt_svc, StreamState.HALF_CLOSED_LOCAL),
+
+ # State: closed
+ (StreamState.CLOSED, StreamInputs.RECV_END_STREAM):
+ (None, StreamState.CLOSED),
+ (StreamState.CLOSED, StreamInputs.RECV_ALTERNATIVE_SERVICE):
+ (None, StreamState.CLOSED),
+
+ # RFC 7540 Section 5.1 defines how the end point should react when
+ # receiving a frame on a closed stream with the following statements:
+ #
+ # > An endpoint that receives any frame other than PRIORITY after receiving
+ # > a RST_STREAM MUST treat that as a stream error of type STREAM_CLOSED.
+ # > An endpoint that receives any frames after receiving a frame with the
+ # > END_STREAM flag set MUST treat that as a connection error of type
+ # > STREAM_CLOSED.
+ (StreamState.CLOSED, StreamInputs.RECV_HEADERS):
+ (H2StreamStateMachine.recv_on_closed_stream, StreamState.CLOSED),
+ (StreamState.CLOSED, StreamInputs.RECV_DATA):
+ (H2StreamStateMachine.recv_on_closed_stream, StreamState.CLOSED),
+
+ # > WINDOW_UPDATE or RST_STREAM frames can be received in this state
+ # > for a short period after a DATA or HEADERS frame containing an
+ # > END_STREAM flag is sent, as instructed in RFC 7540 Section 5.1.
+ # We don't have access to a clock, so we just always allow it.
+ (StreamState.CLOSED, StreamInputs.RECV_WINDOW_UPDATE):
+ (None, StreamState.CLOSED),
+ (StreamState.CLOSED, StreamInputs.RECV_RST_STREAM):
+ (None, StreamState.CLOSED),
+
+ # > A receiver MUST treat the receipt of a PUSH_PROMISE on a stream that is
+ # > neither "open" nor "half-closed (local)" as a connection error of type
+ # > PROTOCOL_ERROR.
+ (StreamState.CLOSED, StreamInputs.RECV_PUSH_PROMISE):
+ (H2StreamStateMachine.recv_push_on_closed_stream, StreamState.CLOSED),
+
+ # Also, users should be forbidden from sending on closed streams.
+ (StreamState.CLOSED, StreamInputs.SEND_HEADERS):
+ (H2StreamStateMachine.send_on_closed_stream, StreamState.CLOSED),
+ (StreamState.CLOSED, StreamInputs.SEND_PUSH_PROMISE):
+ (H2StreamStateMachine.send_push_on_closed_stream, StreamState.CLOSED),
+ (StreamState.CLOSED, StreamInputs.SEND_RST_STREAM):
+ (H2StreamStateMachine.send_on_closed_stream, StreamState.CLOSED),
+ (StreamState.CLOSED, StreamInputs.SEND_DATA):
+ (H2StreamStateMachine.send_on_closed_stream, StreamState.CLOSED),
+ (StreamState.CLOSED, StreamInputs.SEND_WINDOW_UPDATE):
+ (H2StreamStateMachine.send_on_closed_stream, StreamState.CLOSED),
+ (StreamState.CLOSED, StreamInputs.SEND_END_STREAM):
+ (H2StreamStateMachine.send_on_closed_stream, StreamState.CLOSED),
+}
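The dispatch pattern the table implements — `(state, input)` keys mapping to `(side_effect_function, target_state)` tuples — can be shown with a self-contained miniature. The states, inputs, and helper below are invented stand-ins for the example, not h2's API:

```python
# Miniature (state, input) -> (action, next_state) table, mirroring the
# shape of _transitions above with invented stand-in values.
IDLE, OPEN, CLOSED = "idle", "open", "closed"
SEND_HEADERS, SEND_END_STREAM = "send_headers", "send_end_stream"

TRANSITIONS = {
    (IDLE, SEND_HEADERS): (lambda: ["request_sent"], OPEN),
    (OPEN, SEND_END_STREAM): (None, CLOSED),
}

def process_input(state, input_):
    try:
        action, target = TRANSITIONS[(state, input_)]
    except KeyError:
        # Anything not in the table is an invalid transition.
        raise ValueError("invalid input %s in state %s" % (input_, state))
    # Run the side-effect function (if any) to collect events.
    events = action() if action is not None else []
    return target, events
```

A single dictionary lookup both validates the transition and selects the side effect, which is why anything absent from the map is immediately an error.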
+
+
+class H2Stream:
+ """
+ A low-level HTTP/2 stream object. This handles building and receiving
+ frames and maintains per-stream state.
+
+ This wraps an HTTP/2 stream state machine implementation, ensuring that
+ frames can only be sent/received when the stream is in a valid state.
+ Attempts to create frames that cannot be sent will raise a
+ ``ProtocolError``.
+ """
+ def __init__(self,
+ stream_id,
+ config,
+ inbound_window_size,
+ outbound_window_size):
+ self.state_machine = H2StreamStateMachine(stream_id)
+ self.stream_id = stream_id
+ self.max_outbound_frame_size = None
+ self.request_method = None
+
+ # The current value of the outbound stream flow control window
+ self.outbound_flow_control_window = outbound_window_size
+
+ # The flow control manager.
+ self._inbound_window_manager = WindowManager(inbound_window_size)
+
+ # The expected content length, if any.
+ self._expected_content_length = None
+
+ # The actual received content length. Always tracked.
+ self._actual_content_length = 0
+
+ # The authority we believe this stream belongs to.
+ self._authority = None
+
+ # The configuration for this stream.
+ self.config = config
+
+ def __repr__(self):
+ return "<%s id:%d state:%r>" % (
+ type(self).__name__,
+ self.stream_id,
+ self.state_machine.state
+ )
+
+ @property
+ def inbound_flow_control_window(self):
+ """
+ The size of the inbound flow control window for the stream. This is
+ rarely publicly useful: instead, use :meth:`remote_flow_control_window
+ <h2.connection.H2Connection.remote_flow_control_window>`. This property
+ mostly exists to provide convenient access to this data.
+ """
+ return self._inbound_window_manager.current_window_size
+
+ @property
+ def open(self):
+ """
+ Whether the stream is 'open' in any sense: that is, whether it counts
+ against the number of concurrent streams.
+ """
+ # RFC 7540 Section 5.1.2 defines 'open' for this purpose to mean either
+ # the OPEN state or either of the HALF_CLOSED states. Perplexingly,
+ # this excludes the reserved states.
+ # For more detail on why we're doing this in this slightly weird way,
+ # see the comment on ``STREAM_OPEN`` at the top of the file.
+ return STREAM_OPEN[self.state_machine.state]
+
+ @property
+ def closed(self):
+ """
+ Whether the stream is closed.
+ """
+ return self.state_machine.state == StreamState.CLOSED
+
+ @property
+ def closed_by(self):
+ """
+ Returns how the stream was closed, as one of StreamClosedBy.
+ """
+ return self.state_machine.stream_closed_by
+
+ def upgrade(self, client_side):
+ """
+ Called by the connection to indicate that this stream is the initial
+ request/response of an upgraded connection. Places the stream into an
+ appropriate state.
+ """
+ self.config.logger.debug("Upgrading %r", self)
+
+ assert self.stream_id == 1
+ input_ = (
+ StreamInputs.UPGRADE_CLIENT if client_side
+ else StreamInputs.UPGRADE_SERVER
+ )
+
+ # This may return events; we deliberately don't want them.
+ self.state_machine.process_input(input_)
+ return
+
+ def send_headers(self, headers, encoder, end_stream=False):
+ """
+ Returns a list of HEADERS/CONTINUATION frames to emit as either headers
+ or trailers.
+ """
+ self.config.logger.debug("Send headers %s on %r", headers, self)
+
+ # Because encoding headers makes an irreversible change to the header
+ # compression context, we make the state transition before we encode
+ # them.
+
+ # First, check if we're a client. If we are, no problem: if we aren't,
+ # we need to scan the header block to see if this is an informational
+ # response.
+ input_ = StreamInputs.SEND_HEADERS
+ if ((not self.state_machine.client) and
+ is_informational_response(headers)):
+ if end_stream:
+ raise ProtocolError(
+ "Cannot set END_STREAM on informational responses."
+ )
+
+ input_ = StreamInputs.SEND_INFORMATIONAL_HEADERS
+
+ events = self.state_machine.process_input(input_)
+
+ hf = HeadersFrame(self.stream_id)
+ hdr_validation_flags = self._build_hdr_validation_flags(events)
+ frames = self._build_headers_frames(
+ headers, encoder, hf, hdr_validation_flags
+ )
+
+ if end_stream:
+ # Not a bug: the END_STREAM flag is valid on the initial HEADERS
+ # frame, not the CONTINUATION frames that follow.
+ self.state_machine.process_input(StreamInputs.SEND_END_STREAM)
+ frames[0].flags.add('END_STREAM')
+
+ if self.state_machine.trailers_sent and not end_stream:
+ raise ProtocolError("Trailers must have END_STREAM set.")
+
+ if self.state_machine.client and self._authority is None:
+ self._authority = authority_from_headers(headers)
+
+ # store request method for _initialize_content_length
+ self.request_method = extract_method_header(headers)
+
+ return frames
+
+ def push_stream_in_band(self, related_stream_id, headers, encoder):
+ """
+ Returns a list of PUSH_PROMISE/CONTINUATION frames to emit as a pushed
+ stream header. Called on the stream that has the PUSH_PROMISE frame
+ sent on it.
+ """
+ self.config.logger.debug("Push stream %r", self)
+
+ # Because encoding headers makes an irreversible change to the header
+ # compression context, we make the state transition *first*.
+
+ events = self.state_machine.process_input(
+ StreamInputs.SEND_PUSH_PROMISE
+ )
+
+ ppf = PushPromiseFrame(self.stream_id)
+ ppf.promised_stream_id = related_stream_id
+ hdr_validation_flags = self._build_hdr_validation_flags(events)
+ frames = self._build_headers_frames(
+ headers, encoder, ppf, hdr_validation_flags
+ )
+
+ return frames
+
+ def locally_pushed(self):
+ """
+ Mark this stream as one that was pushed by this peer. Must be called
+ immediately after initialization. Sends no frames, simply updates the
+ state machine.
+ """
+ # This does not trigger any events.
+ events = self.state_machine.process_input(
+ StreamInputs.SEND_PUSH_PROMISE
+ )
+ assert not events
+ return []
+
+ def send_data(self, data, end_stream=False, pad_length=None):
+ """
+ Prepare some data frames. Optionally end the stream.
+
+ .. warning:: Does not perform flow control checks.
+ """
+ self.config.logger.debug(
+ "Send data on %r with end stream set to %s", self, end_stream
+ )
+
+ self.state_machine.process_input(StreamInputs.SEND_DATA)
+
+ df = DataFrame(self.stream_id)
+ df.data = data
+ if end_stream:
+ self.state_machine.process_input(StreamInputs.SEND_END_STREAM)
+ df.flags.add('END_STREAM')
+ if pad_length is not None:
+ df.flags.add('PADDED')
+ df.pad_length = pad_length
+
+ # Subtract flow_controlled_length to account for possible padding
+ self.outbound_flow_control_window -= df.flow_controlled_length
+ assert self.outbound_flow_control_window >= 0
+
+ return [df]
+
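The flow-control deduction above uses the frame's `flow_controlled_length`, which counts more than the payload when padding is in play. A minimal sketch of that arithmetic, with `flow_controlled_length` here a hypothetical standalone helper (not h2's API):

```python
# Sketch of DATA-frame flow-control accounting. Per RFC 7540 § 6.9.1 the
# whole frame payload counts against the window: the data itself, any
# padding, and the one-byte Pad Length field that the PADDED flag adds.
def flow_controlled_length(data, pad_length=None):
    if pad_length is None:
        return len(data)
    return len(data) + pad_length + 1
```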
+ def end_stream(self):
+ """
+ End a stream without sending data.
+ """
+ self.config.logger.debug("End stream %r", self)
+
+ self.state_machine.process_input(StreamInputs.SEND_END_STREAM)
+ df = DataFrame(self.stream_id)
+ df.flags.add('END_STREAM')
+ return [df]
+
+ def advertise_alternative_service(self, field_value):
+ """
+ Advertise an RFC 7838 alternative service. The semantics of this are
+ better documented in the ``H2Connection`` class.
+ """
+ self.config.logger.debug(
+ "Advertise alternative service of %r for %r", field_value, self
+ )
+ self.state_machine.process_input(StreamInputs.SEND_ALTERNATIVE_SERVICE)
+ asf = AltSvcFrame(self.stream_id)
+ asf.field = field_value
+ return [asf]
+
+ def increase_flow_control_window(self, increment):
+ """
+ Increase the size of the flow control window for the remote side.
+ """
+ self.config.logger.debug(
+ "Increase flow control window for %r by %d",
+ self, increment
+ )
+ self.state_machine.process_input(StreamInputs.SEND_WINDOW_UPDATE)
+ self._inbound_window_manager.window_opened(increment)
+
+ wuf = WindowUpdateFrame(self.stream_id)
+ wuf.window_increment = increment
+ return [wuf]
+
+ def receive_push_promise_in_band(self,
+ promised_stream_id,
+ headers,
+ header_encoding):
+ """
+ Receives a push promise frame sent on this stream, pushing a remote
+ stream. This is called on the stream that has the PUSH_PROMISE sent
+ on it.
+ """
+ self.config.logger.debug(
+ "Receive Push Promise on %r for remote stream %d",
+ self, promised_stream_id
+ )
+ events = self.state_machine.process_input(
+ StreamInputs.RECV_PUSH_PROMISE
+ )
+ events[0].pushed_stream_id = promised_stream_id
+
+ hdr_validation_flags = self._build_hdr_validation_flags(events)
+ events[0].headers = self._process_received_headers(
+ headers, hdr_validation_flags, header_encoding
+ )
+ return [], events
+
+ def remotely_pushed(self, pushed_headers):
+ """
+ Mark this stream as one that was pushed by the remote peer. Must be
+        called immediately after initialization. Sends no frames; it simply
+        updates the state machine.
+ """
+ self.config.logger.debug("%r pushed by remote peer", self)
+ events = self.state_machine.process_input(
+ StreamInputs.RECV_PUSH_PROMISE
+ )
+ self._authority = authority_from_headers(pushed_headers)
+ return [], events
+
+ def receive_headers(self, headers, end_stream, header_encoding):
+ """
+ Receive a set of headers (or trailers).
+ """
+ if is_informational_response(headers):
+ if end_stream:
+ raise ProtocolError(
+ "Cannot set END_STREAM on informational responses"
+ )
+ input_ = StreamInputs.RECV_INFORMATIONAL_HEADERS
+ else:
+ input_ = StreamInputs.RECV_HEADERS
+
+ events = self.state_machine.process_input(input_)
+
+ if end_stream:
+ es_events = self.state_machine.process_input(
+ StreamInputs.RECV_END_STREAM
+ )
+ events[0].stream_ended = es_events[0]
+ events += es_events
+
+ self._initialize_content_length(headers)
+
+ if isinstance(events[0], TrailersReceived):
+ if not end_stream:
+ raise ProtocolError("Trailers must have END_STREAM set")
+
+ hdr_validation_flags = self._build_hdr_validation_flags(events)
+ events[0].headers = self._process_received_headers(
+ headers, hdr_validation_flags, header_encoding
+ )
+ return [], events
+
+ def receive_data(self, data, end_stream, flow_control_len):
+ """
+ Receive some data.
+ """
+ self.config.logger.debug(
+ "Receive data on %r with end stream %s and flow control length "
+ "set to %d", self, end_stream, flow_control_len
+ )
+ events = self.state_machine.process_input(StreamInputs.RECV_DATA)
+ self._inbound_window_manager.window_consumed(flow_control_len)
+ self._track_content_length(len(data), end_stream)
+
+ if end_stream:
+ es_events = self.state_machine.process_input(
+ StreamInputs.RECV_END_STREAM
+ )
+ events[0].stream_ended = es_events[0]
+ events.extend(es_events)
+
+ events[0].data = data
+ events[0].flow_controlled_length = flow_control_len
+ return [], events
+
+ def receive_window_update(self, increment):
+ """
+ Handle a WINDOW_UPDATE increment.
+ """
+ self.config.logger.debug(
+ "Receive Window Update on %r for increment of %d",
+ self, increment
+ )
+ events = self.state_machine.process_input(
+ StreamInputs.RECV_WINDOW_UPDATE
+ )
+ frames = []
+
+ # If we encounter a problem with incrementing the flow control window,
+ # this should be treated as a *stream* error, not a *connection* error.
+ # That means we need to catch the error and forcibly close the stream.
+ if events:
+ events[0].delta = increment
+ try:
+ self.outbound_flow_control_window = guard_increment_window(
+ self.outbound_flow_control_window,
+ increment
+ )
+ except FlowControlError:
+ # Ok, this is bad. We're going to need to perform a local
+ # reset.
+ event = StreamReset()
+ event.stream_id = self.stream_id
+ event.error_code = ErrorCodes.FLOW_CONTROL_ERROR
+ event.remote_reset = False
+
+ events = [event]
+ frames = self.reset_stream(event.error_code)
+
+ return frames, events
+
+ def receive_continuation(self):
+ """
+ A naked CONTINUATION frame has been received. This is always an error,
+        but the kind of error depends on the state of the stream, and the
+        stream state must still be transitioned, so we feed the input to the
+        state machine.
+ """
+ self.config.logger.debug("Receive Continuation frame on %r", self)
+ self.state_machine.process_input(
+ StreamInputs.RECV_CONTINUATION
+ )
+ assert False, "Should not be reachable"
+
+ def receive_alt_svc(self, frame):
+ """
+ An Alternative Service frame was received on the stream. This frame
+ inherits the origin associated with this stream.
+ """
+ self.config.logger.debug(
+ "Receive Alternative Service frame on stream %r", self
+ )
+
+ # If the origin is present, RFC 7838 says we have to ignore it.
+ if frame.origin:
+ return [], []
+
+ events = self.state_machine.process_input(
+ StreamInputs.RECV_ALTERNATIVE_SERVICE
+ )
+
+ # There are lots of situations where we want to ignore the ALTSVC
+ # frame. If we need to pay attention, we'll have an event and should
+ # fill it out.
+ if events:
+ assert isinstance(events[0], AlternativeServiceAvailable)
+ events[0].origin = self._authority
+ events[0].field_value = frame.field
+
+ return [], events
+
+ def reset_stream(self, error_code=0):
+ """
+ Close the stream locally. Reset the stream with an error code.
+ """
+ self.config.logger.debug(
+ "Local reset %r with error code: %d", self, error_code
+ )
+ self.state_machine.process_input(StreamInputs.SEND_RST_STREAM)
+
+ rsf = RstStreamFrame(self.stream_id)
+ rsf.error_code = error_code
+ return [rsf]
+
+ def stream_reset(self, frame):
+ """
+ Handle a stream being reset remotely.
+ """
+ self.config.logger.debug(
+ "Remote reset %r with error code: %d", self, frame.error_code
+ )
+ events = self.state_machine.process_input(StreamInputs.RECV_RST_STREAM)
+
+ if events:
+ # We don't fire an event if this stream is already closed.
+ events[0].error_code = _error_code_from_int(frame.error_code)
+
+ return [], events
+
+ def acknowledge_received_data(self, acknowledged_size):
+ """
+ The user has informed us that they've processed some amount of data
+ that was received on this stream. Pass that to the window manager and
+ potentially return some WindowUpdate frames.
+ """
+ self.config.logger.debug(
+ "Acknowledge received data with size %d on %r",
+ acknowledged_size, self
+ )
+ increment = self._inbound_window_manager.process_bytes(
+ acknowledged_size
+ )
+ if increment:
+ f = WindowUpdateFrame(self.stream_id)
+ f.window_increment = increment
+ return [f]
+
+ return []
+
+ def _build_hdr_validation_flags(self, events):
+ """
+ Constructs a set of header validation flags for use when normalizing
+ and validating header blocks.
+ """
+ is_trailer = isinstance(
+ events[0], (_TrailersSent, TrailersReceived)
+ )
+ is_response_header = isinstance(
+ events[0],
+ (
+ _ResponseSent,
+ ResponseReceived,
+ InformationalResponseReceived
+ )
+ )
+ is_push_promise = isinstance(
+ events[0], (PushedStreamReceived, _PushedRequestSent)
+ )
+
+ return HeaderValidationFlags(
+ is_client=self.state_machine.client,
+ is_trailer=is_trailer,
+ is_response_header=is_response_header,
+ is_push_promise=is_push_promise,
+ )
+
+ def _build_headers_frames(self,
+ headers,
+ encoder,
+ first_frame,
+ hdr_validation_flags):
+ """
+ Helper method to build headers or push promise frames.
+ """
+ # We need to lowercase the header names, and to ensure that secure
+ # header fields are kept out of compression contexts.
+ if self.config.normalize_outbound_headers:
+ headers = normalize_outbound_headers(
+ headers, hdr_validation_flags
+ )
+ if self.config.validate_outbound_headers:
+ headers = validate_outbound_headers(
+ headers, hdr_validation_flags
+ )
+
+ encoded_headers = encoder.encode(headers)
+
+ # Slice into blocks of max_outbound_frame_size. Be careful with this:
+ # it only works right because we never send padded frames or priority
+ # information on the frames. Revisit this if we do.
+ header_blocks = [
+ encoded_headers[i:i+self.max_outbound_frame_size]
+ for i in range(
+ 0, len(encoded_headers), self.max_outbound_frame_size
+ )
+ ]
+
+ frames = []
+ first_frame.data = header_blocks[0]
+ frames.append(first_frame)
+
+ for block in header_blocks[1:]:
+ cf = ContinuationFrame(self.stream_id)
+ cf.data = block
+ frames.append(cf)
+
+ frames[-1].flags.add('END_HEADERS')
+ return frames
+
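The CONTINUATION slicing above is easy to see in isolation. A self-contained sketch, where `max_outbound_frame_size` is a stand-in value (in h2 the real value comes from the peer's SETTINGS):

```python
# The header-block slicing from _build_headers_frames: a long encoded
# block is cut into chunks of at most max_outbound_frame_size bytes. The
# first chunk rides the HEADERS/PUSH_PROMISE frame, the rest ride
# CONTINUATION frames, and only the last frame carries END_HEADERS.
max_outbound_frame_size = 4  # stand-in value for illustration

encoded_headers = b'0123456789'  # pretend HPACK output
header_blocks = [
    encoded_headers[i:i + max_outbound_frame_size]
    for i in range(0, len(encoded_headers), max_outbound_frame_size)
]
```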
+ def _process_received_headers(self,
+ headers,
+ header_validation_flags,
+ header_encoding):
+ """
+ When headers have been received from the remote peer, run a processing
+ pipeline on them to transform them into the appropriate form for
+ attaching to an event.
+ """
+ if self.config.normalize_inbound_headers:
+ headers = normalize_inbound_headers(
+ headers, header_validation_flags
+ )
+
+ if self.config.validate_inbound_headers:
+ headers = validate_headers(headers, header_validation_flags)
+
+ if header_encoding:
+ headers = _decode_headers(headers, header_encoding)
+
+ # The above steps are all generators, so we need to concretize the
+ # headers now.
+ return list(headers)
+
+ def _initialize_content_length(self, headers):
+ """
+ Checks the headers for a content-length header and initializes the
+ _expected_content_length field from it. It's not an error for no
+ Content-Length header to be present.
+ """
+ if self.request_method == b'HEAD':
+ self._expected_content_length = 0
+ return
+
+ for n, v in headers:
+ if n == b'content-length':
+ try:
+ self._expected_content_length = int(v, 10)
+ except ValueError:
+ raise ProtocolError(
+ "Invalid content-length header: %s" % v
+ )
+
+ return
+
+ def _track_content_length(self, length, end_stream):
+ """
+ Update the expected content length in response to data being received.
+ Validates that the appropriate amount of data is sent. Always updates
+ the received data, but only validates the length against the
+ content-length header if one was sent.
+
+ :param length: The length of the body chunk received.
+ :param end_stream: If this is the last body chunk received.
+ """
+ self._actual_content_length += length
+ actual = self._actual_content_length
+ expected = self._expected_content_length
+
+ if expected is not None:
+ if expected < actual:
+ raise InvalidBodyLengthError(expected, actual)
+
+ if end_stream and expected != actual:
+ raise InvalidBodyLengthError(expected, actual)
+
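The length bookkeeping in `_track_content_length` boils down to two comparisons. A boolean sketch, with `check_length` a hypothetical helper that returns `False` where h2 raises `InvalidBodyLengthError`:

```python
# Boolean sketch of the checks in _track_content_length.
def check_length(expected, actual, end_stream):
    if expected is None:
        return True  # no content-length header: nothing to enforce
    if expected < actual:
        return False  # received more data than promised
    if end_stream and expected != actual:
        return False  # stream ended before the promised length arrived
    return True
```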
+ def _inbound_flow_control_change_from_settings(self, delta):
+ """
+ We changed SETTINGS_INITIAL_WINDOW_SIZE, which means we need to
+ update the target window size for flow control. For our flow control
+ strategy, this means we need to do two things: we need to adjust the
+ current window size, but we also need to set the target maximum window
+ size to the new value.
+ """
+ new_max_size = self._inbound_window_manager.max_window_size + delta
+ self._inbound_window_manager.window_opened(delta)
+ self._inbound_window_manager.max_window_size = new_max_size
+
+
+def _decode_headers(headers, encoding):
+ """
+ Given an iterable of header two-tuples and an encoding, decodes those
+ headers using that encoding while preserving the type of the header tuple.
+ This ensures that the use of ``HeaderTuple`` is preserved.
+ """
+ for header in headers:
+        # This function works on headers that have come out of HPACK
+        # decoding, which are always HeaderTuple objects.
+ assert isinstance(header, HeaderTuple)
+
+ name, value = header
+ name = name.decode(encoding)
+ value = value.decode(encoding)
+ yield header.__class__(name, value)
diff --git a/.venv/lib/python3.9/site-packages/h2/utilities.py b/.venv/lib/python3.9/site-packages/h2/utilities.py
new file mode 100644
index 0000000..eb07f57
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2/utilities.py
@@ -0,0 +1,656 @@
+# -*- coding: utf-8 -*-
+"""
+h2/utilities
+~~~~~~~~~~~~
+
+Utility functions that do not belong in a separate module.
+"""
+import collections
+import re
+from string import whitespace
+
+from hpack import HeaderTuple, NeverIndexedHeaderTuple
+
+from .exceptions import ProtocolError, FlowControlError
+
+UPPER_RE = re.compile(b"[A-Z]")
+
+# A set of headers that are hop-by-hop or connection-specific and thus
+# forbidden in HTTP/2. This list comes from RFC 7540 § 8.1.2.2.
+CONNECTION_HEADERS = frozenset([
+ b'connection', u'connection',
+ b'proxy-connection', u'proxy-connection',
+ b'keep-alive', u'keep-alive',
+ b'transfer-encoding', u'transfer-encoding',
+ b'upgrade', u'upgrade',
+])
+
+
+_ALLOWED_PSEUDO_HEADER_FIELDS = frozenset([
+ b':method', u':method',
+ b':scheme', u':scheme',
+ b':authority', u':authority',
+ b':path', u':path',
+ b':status', u':status',
+ b':protocol', u':protocol',
+])
+
+
+_SECURE_HEADERS = frozenset([
+ # May have basic credentials which are vulnerable to dictionary attacks.
+ b'authorization', u'authorization',
+ b'proxy-authorization', u'proxy-authorization',
+])
+
+
+_REQUEST_ONLY_HEADERS = frozenset([
+ b':scheme', u':scheme',
+ b':path', u':path',
+ b':authority', u':authority',
+ b':method', u':method',
+ b':protocol', u':protocol',
+])
+
+
+_RESPONSE_ONLY_HEADERS = frozenset([b':status', u':status'])
+
+
+# A set of pseudo-headers that are only valid if the method is
+# CONNECT; see RFC 8441 § 5.
+_CONNECT_REQUEST_ONLY_HEADERS = frozenset([b':protocol', u':protocol'])
+
+
+_WHITESPACE = frozenset(map(ord, whitespace))
+
+
+def _secure_headers(headers, hdr_validation_flags):
+ """
+ Certain headers are at risk of being attacked during the header compression
+ phase, and so need to be kept out of header compression contexts. This
+ function automatically transforms certain specific headers into HPACK
+ never-indexed fields to ensure they don't get added to header compression
+ contexts.
+
+ This function currently implements two rules:
+
+ - 'authorization' and 'proxy-authorization' fields are automatically made
+ never-indexed.
+ - Any 'cookie' header field shorter than 20 bytes long is made
+ never-indexed.
+
+ These fields are the most at-risk. These rules are inspired by Firefox
+ and nghttp2.
+ """
+ for header in headers:
+ if header[0] in _SECURE_HEADERS:
+ yield NeverIndexedHeaderTuple(*header)
+ elif header[0] in (b'cookie', u'cookie') and len(header[1]) < 20:
+ yield NeverIndexedHeaderTuple(*header)
+ else:
+ yield header
+
+
+def extract_method_header(headers):
+ """
+ Extracts the request method from the headers list.
+ """
+ for k, v in headers:
+ if k in (b':method', u':method'):
+ if not isinstance(v, bytes):
+ return v.encode('utf-8')
+ else:
+ return v
+
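`extract_method_header` can be exercised directly; it is re-declared here so the snippet runs on its own:

```python
# Re-declaration of extract_method_header for a self-contained example.
# Headers may arrive as str (from the user) or bytes (from HPACK); the
# :method value is normalized to bytes either way.
def extract_method_header(headers):
    for k, v in headers:
        if k in (b':method', u':method'):
            return v if isinstance(v, bytes) else v.encode('utf-8')

method = extract_method_header([(b':method', b'GET'), (b':path', b'/')])
```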
+
+def is_informational_response(headers):
+ """
+ Searches a header block for a :status header to confirm that a given
+ collection of headers are an informational response. Assumes the header
+ block is well formed: that is, that the HTTP/2 special headers are first
+ in the block, and so that it can stop looking when it finds the first
+ header field whose name does not begin with a colon.
+
+ :param headers: The HTTP/2 header block.
+ :returns: A boolean indicating if this is an informational response.
+ """
+ for n, v in headers:
+ if isinstance(n, bytes):
+ sigil = b':'
+ status = b':status'
+ informational_start = b'1'
+ else:
+ sigil = u':'
+ status = u':status'
+ informational_start = u'1'
+
+ # If we find a non-special header, we're done here: stop looping.
+ if not n.startswith(sigil):
+ return False
+
+        # This isn't the status header; keep looking.
+ if n != status:
+ continue
+
+ # If the first digit is a 1, we've got informational headers.
+ return v.startswith(informational_start)
+
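A bytes-only sketch of `is_informational_response` (the real function also accepts str header names) shows the early-exit behaviour:

```python
# Pseudo-headers sort first in a well-formed block, so scanning stops at
# the first name without a leading colon. 1xx statuses are informational.
def is_informational_response(headers):
    for n, v in headers:
        if not n.startswith(b':'):
            return False  # past the pseudo-headers: not informational
        if n != b':status':
            continue
        return v.startswith(b'1')
```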
+
+def guard_increment_window(current, increment):
+ """
+ Increments a flow control window, guarding against that window becoming too
+ large.
+
+ :param current: The current value of the flow control window.
+ :param increment: The increment to apply to that window.
+ :returns: The new value of the window.
+ :raises: ``FlowControlError``
+ """
+ # The largest value the flow control window may take.
+ LARGEST_FLOW_CONTROL_WINDOW = 2**31 - 1
+
+ new_size = current + increment
+
+ if new_size > LARGEST_FLOW_CONTROL_WINDOW:
+ raise FlowControlError(
+ "May not increment flow control window past %d" %
+ LARGEST_FLOW_CONTROL_WINDOW
+ )
+
+ return new_size
+
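`guard_increment_window` is small enough to try standalone; in this sketch `ValueError` stands in for h2's `FlowControlError` so the snippet stays dependency-free:

```python
# Standalone sketch of guard_increment_window.
def guard_increment_window(current, increment):
    LARGEST_FLOW_CONTROL_WINDOW = 2**31 - 1  # RFC 7540 § 6.9.1 ceiling

    new_size = current + increment
    if new_size > LARGEST_FLOW_CONTROL_WINDOW:
        raise ValueError(  # h2 raises FlowControlError here
            "May not increment flow control window past %d" %
            LARGEST_FLOW_CONTROL_WINDOW
        )
    return new_size
```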
+
+def authority_from_headers(headers):
+ """
+ Given a header set, searches for the authority header and returns the
+ value.
+
+    Note that when no authority header is present this scans the entire
+    header set, so it should only be called on client request headers;
+    running it over large response blocks is potentially unwise.
+
+ :param headers: The HTTP header set.
+ :returns: The value of the authority header, or ``None``.
+ :rtype: ``bytes`` or ``None``.
+ """
+ for n, v in headers:
+ # This gets run against headers that come both from HPACK and from the
+ # user, so we may have unicode floating around in here. We only want
+ # bytes.
+ if n in (b':authority', u':authority'):
+ return v.encode('utf-8') if not isinstance(v, bytes) else v
+
+ return None
+
+
+# Flags used by the validate_headers pipeline to determine which checks
+# should be applied to a given set of headers.
+HeaderValidationFlags = collections.namedtuple(
+ 'HeaderValidationFlags',
+ ['is_client', 'is_trailer', 'is_response_header', 'is_push_promise']
+)
+
+
+def validate_headers(headers, hdr_validation_flags):
+ """
+ Validates a header sequence against a set of constraints from RFC 7540.
+
+ :param headers: The HTTP header set.
+ :param hdr_validation_flags: An instance of HeaderValidationFlags.
+ """
+ # This validation logic is built on a sequence of generators that are
+ # iterated over to provide the final header list. This reduces some of the
+ # overhead of doing this checking. However, it's worth noting that this
+ # checking remains somewhat expensive, and attempts should be made wherever
+ # possible to reduce the time spent doing them.
+ #
+    # For example, we avoid tuple unpacking in loops because it represents a
+ # fixed cost that we don't want to spend, instead indexing into the header
+ # tuples.
+ headers = _reject_uppercase_header_fields(
+ headers, hdr_validation_flags
+ )
+ headers = _reject_surrounding_whitespace(
+ headers, hdr_validation_flags
+ )
+ headers = _reject_te(
+ headers, hdr_validation_flags
+ )
+ headers = _reject_connection_header(
+ headers, hdr_validation_flags
+ )
+ headers = _reject_pseudo_header_fields(
+ headers, hdr_validation_flags
+ )
+ headers = _check_host_authority_header(
+ headers, hdr_validation_flags
+ )
+ headers = _check_path_header(headers, hdr_validation_flags)
+
+ return headers
+
+
+def _reject_uppercase_header_fields(headers, hdr_validation_flags):
+ """
+ Raises a ProtocolError if any uppercase character is found in a header
+ block.
+ """
+ for header in headers:
+ if UPPER_RE.search(header[0]):
+ raise ProtocolError(
+ "Received uppercase header name %s." % header[0])
+ yield header
+
+
+def _reject_surrounding_whitespace(headers, hdr_validation_flags):
+ """
+ Raises a ProtocolError if any header name or value is surrounded by
+ whitespace characters.
+ """
+ # For compatibility with RFC 7230 header fields, we need to allow the field
+ # value to be an empty string. This is ludicrous, but technically allowed.
+ # The field name may not be empty, though, so we can safely assume that it
+ # must have at least one character in it and throw exceptions if it
+ # doesn't.
+ for header in headers:
+ if header[0][0] in _WHITESPACE or header[0][-1] in _WHITESPACE:
+ raise ProtocolError(
+ "Received header name surrounded by whitespace %r" % header[0])
+ if header[1] and ((header[1][0] in _WHITESPACE) or
+ (header[1][-1] in _WHITESPACE)):
+ raise ProtocolError(
+ "Received header value surrounded by whitespace %r" % header[1]
+ )
+ yield header
+
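The membership tests above work because indexing a `bytes` object yields integers, which is why `_WHITESPACE` is built from `ord` values. A self-contained illustration, with `has_surrounding_whitespace` a hypothetical boolean mirror of the generator:

```python
from string import whitespace

# Indexing bytes yields ints (b' '[0] == 32), so membership is tested
# against a set of ordinals rather than one-character strings.
_WHITESPACE = frozenset(map(ord, whitespace))

def has_surrounding_whitespace(name, value):
    # Names may never be padded; values may be empty, so guard before
    # indexing them.
    if name[0] in _WHITESPACE or name[-1] in _WHITESPACE:
        return True
    if value and (value[0] in _WHITESPACE or value[-1] in _WHITESPACE):
        return True
    return False
```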
+
+def _reject_te(headers, hdr_validation_flags):
+ """
+ Raises a ProtocolError if the TE header is present in a header block and
+ its value is anything other than "trailers".
+ """
+ for header in headers:
+ if header[0] in (b'te', u'te'):
+ if header[1].lower() not in (b'trailers', u'trailers'):
+ raise ProtocolError(
+                    "Invalid value for TE header: %s" %
+ header[1]
+ )
+
+ yield header
+
+
+def _reject_connection_header(headers, hdr_validation_flags):
+ """
+ Raises a ProtocolError if the Connection header is present in a header
+ block.
+ """
+ for header in headers:
+ if header[0] in CONNECTION_HEADERS:
+ raise ProtocolError(
+ "Connection-specific header field present: %s." % header[0]
+ )
+
+ yield header
+
+
+def _custom_startswith(test_string, bytes_prefix, unicode_prefix):
+ """
+ Given a string that might be a bytestring or a Unicode string,
+ return True if it starts with the appropriate prefix.
+ """
+ if isinstance(test_string, bytes):
+ return test_string.startswith(bytes_prefix)
+ else:
+ return test_string.startswith(unicode_prefix)
+
+
+def _assert_header_in_set(string_header, bytes_header, header_set):
+ """
+ Given a set of header names, checks whether the string or byte version of
+    the header name is present. Raises a ProtocolError with an appropriate
+    message if it is missing.
+ """
+ if not (string_header in header_set or bytes_header in header_set):
+ raise ProtocolError(
+ "Header block missing mandatory %s header" % string_header
+ )
+
+
+def _reject_pseudo_header_fields(headers, hdr_validation_flags):
+ """
+ Raises a ProtocolError if duplicate pseudo-header fields are found in a
+ header block or if a pseudo-header field appears in a block after an
+ ordinary header field.
+
+ Raises a ProtocolError if pseudo-header fields are found in trailers.
+ """
+ seen_pseudo_header_fields = set()
+ seen_regular_header = False
+ method = None
+
+ for header in headers:
+ if _custom_startswith(header[0], b':', u':'):
+ if header[0] in seen_pseudo_header_fields:
+ raise ProtocolError(
+ "Received duplicate pseudo-header field %s" % header[0]
+ )
+
+ seen_pseudo_header_fields.add(header[0])
+
+ if seen_regular_header:
+ raise ProtocolError(
+ "Received pseudo-header field out of sequence: %s" %
+ header[0]
+ )
+
+ if header[0] not in _ALLOWED_PSEUDO_HEADER_FIELDS:
+ raise ProtocolError(
+ "Received custom pseudo-header field %s" % header[0]
+ )
+
+ if header[0] in (b':method', u':method'):
+ if not isinstance(header[1], bytes):
+ method = header[1].encode('utf-8')
+ else:
+ method = header[1]
+
+ else:
+ seen_regular_header = True
+
+ yield header
+
+ # Check the pseudo-headers we got to confirm they're acceptable.
+ _check_pseudo_header_field_acceptability(
+ seen_pseudo_header_fields, method, hdr_validation_flags
+ )
+
+
+def _check_pseudo_header_field_acceptability(pseudo_headers,
+ method,
+ hdr_validation_flags):
+ """
+ Given the set of pseudo-headers present in a header block and the
+ validation flags, confirms that RFC 7540 allows them.
+ """
+ # Pseudo-header fields MUST NOT appear in trailers - RFC 7540 § 8.1.2.1
+ if hdr_validation_flags.is_trailer and pseudo_headers:
+ raise ProtocolError(
+ "Received pseudo-header in trailer %s" % pseudo_headers
+ )
+
+ # If ':status' pseudo-header is not there in a response header, reject it.
+ # Similarly, if ':path', ':method', or ':scheme' are not there in a request
+ # header, reject it. Additionally, if a response contains any request-only
+ # headers or vice-versa, reject it.
+ # Relevant RFC section: RFC 7540 § 8.1.2.4
+ # https://tools.ietf.org/html/rfc7540#section-8.1.2.4
+ if hdr_validation_flags.is_response_header:
+ _assert_header_in_set(u':status', b':status', pseudo_headers)
+ invalid_response_headers = pseudo_headers & _REQUEST_ONLY_HEADERS
+ if invalid_response_headers:
+ raise ProtocolError(
+ "Encountered request-only headers %s" %
+ invalid_response_headers
+ )
+ elif (not hdr_validation_flags.is_response_header and
+ not hdr_validation_flags.is_trailer):
+ # This is a request, so we need to have seen :path, :method, and
+ # :scheme.
+ _assert_header_in_set(u':path', b':path', pseudo_headers)
+ _assert_header_in_set(u':method', b':method', pseudo_headers)
+ _assert_header_in_set(u':scheme', b':scheme', pseudo_headers)
+ invalid_request_headers = pseudo_headers & _RESPONSE_ONLY_HEADERS
+ if invalid_request_headers:
+ raise ProtocolError(
+ "Encountered response-only headers %s" %
+ invalid_request_headers
+ )
+ if method != b'CONNECT':
+ invalid_headers = pseudo_headers & _CONNECT_REQUEST_ONLY_HEADERS
+ if invalid_headers:
+ raise ProtocolError(
+ "Encountered connect-request-only headers %s" %
+ invalid_headers
+ )
+
+
+def _validate_host_authority_header(headers):
+ """
+ Given the :authority and Host headers from a request block that isn't
+ a trailer, check that:
+ 1. At least one of these headers is set.
+ 2. If both headers are set, they match.
+
+ :param headers: The HTTP header set.
+ :raises: ``ProtocolError``
+ """
+ # We use None as a sentinel value. Iterate over the list of headers,
+ # and record the value of these headers (if present). We don't need
+ # to worry about receiving duplicate :authority headers, as this is
+ # enforced by the _reject_pseudo_header_fields() pipeline.
+ #
+ # TODO: We should also guard against receiving duplicate Host headers,
+ # and against sending duplicate headers.
+ authority_header_val = None
+ host_header_val = None
+
+ for header in headers:
+ if header[0] in (b':authority', u':authority'):
+ authority_header_val = header[1]
+ elif header[0] in (b'host', u'host'):
+ host_header_val = header[1]
+
+ yield header
+
+ # If we have not-None values for these variables, then we know we saw
+ # the corresponding header.
+ authority_present = (authority_header_val is not None)
+ host_present = (host_header_val is not None)
+
+ # It is an error for a request header block to contain neither
+ # an :authority header nor a Host header.
+ if not authority_present and not host_present:
+ raise ProtocolError(
+ "Request header block does not have an :authority or Host header."
+ )
+
+ # If we receive both headers, they should definitely match.
+ if authority_present and host_present:
+ if authority_header_val != host_header_val:
+ raise ProtocolError(
+ "Request header block has mismatched :authority and "
+ "Host headers: %r / %r"
+ % (authority_header_val, host_header_val)
+ )
+
+
+def _check_host_authority_header(headers, hdr_validation_flags):
+ """
+ Raises a ProtocolError if a header block arrives that does not contain an
+ :authority or a Host header, or if a header block contains both fields,
+ but their values do not match.
+ """
+ # We only expect to see :authority and Host headers on request header
+ # blocks that aren't trailers, so skip this validation if this is a
+ # response header or we're looking at trailer blocks.
+ skip_validation = (
+ hdr_validation_flags.is_response_header or
+ hdr_validation_flags.is_trailer
+ )
+ if skip_validation:
+ return headers
+
+ return _validate_host_authority_header(headers)
+
+
+def _check_path_header(headers, hdr_validation_flags):
+ """
+ Raise a ProtocolError if a header block arrives or is sent that contains an
+ empty :path header.
+ """
+ def inner():
+ for header in headers:
+ if header[0] in (b':path', u':path'):
+ if not header[1]:
+ raise ProtocolError("An empty :path header is forbidden")
+
+ yield header
+
+    # We only expect to see the :path pseudo-header on request header
+    # blocks that aren't trailers, so skip this validation if this is a
+ # response header or we're looking at trailer blocks.
+ skip_validation = (
+ hdr_validation_flags.is_response_header or
+ hdr_validation_flags.is_trailer
+ )
+ if skip_validation:
+ return headers
+ else:
+ return inner()
+
+
+def _lowercase_header_names(headers, hdr_validation_flags):
+ """
+ Given an iterable of header two-tuples, rebuilds that iterable with the
+    header names lowercased. This generator preserves the original type of
+    each header tuple, whether a plain tuple or any ``HeaderTuple``.
+ """
+ for header in headers:
+ if isinstance(header, HeaderTuple):
+ yield header.__class__(header[0].lower(), header[1])
+ else:
+ yield (header[0].lower(), header[1])
+
+
+def _strip_surrounding_whitespace(headers, hdr_validation_flags):
+ """
+ Given an iterable of header two-tuples, strip both leading and trailing
+ whitespace from both header names and header values. This generator
+    produces tuples that preserve the original type of each header tuple,
+    whether a plain tuple or any ``HeaderTuple``.
+ """
+ for header in headers:
+ if isinstance(header, HeaderTuple):
+ yield header.__class__(header[0].strip(), header[1].strip())
+ else:
+ yield (header[0].strip(), header[1].strip())
+
+
+def _strip_connection_headers(headers, hdr_validation_flags):
+ """
+    Strip any connection headers as per RFC 7540 § 8.1.2.2.
+ """
+ for header in headers:
+ if header[0] not in CONNECTION_HEADERS:
+ yield header
+
+
+def _check_sent_host_authority_header(headers, hdr_validation_flags):
+ """
+ Raises an InvalidHeaderBlockError if we try to send a header block
+ that does not contain an :authority or a Host header, or if
+ the header block contains both fields, but their values do not match.
+ """
+ # We only expect to see :authority and Host headers on request header
+ # blocks that aren't trailers, so skip this validation if this is a
+ # response header or we're looking at trailer blocks.
+ skip_validation = (
+ hdr_validation_flags.is_response_header or
+ hdr_validation_flags.is_trailer
+ )
+ if skip_validation:
+ return headers
+
+ return _validate_host_authority_header(headers)
+
+
+def _combine_cookie_fields(headers, hdr_validation_flags):
+ """
+ RFC 7540 § 8.1.2.5 allows HTTP/2 clients to split the Cookie header field,
+ which must normally appear only once, into multiple fields for better
+ compression. However, they MUST be joined back up again when received.
+ This normalization step applies that transform. The side-effect is that
+ all cookie fields now appear *last* in the header block.
+ """
+ # There is a problem here about header indexing. Specifically, it's
+ # possible that all these cookies are sent with different header indexing
+ # values. At this point it shouldn't matter too much, so we apply our own
+ # logic and make them never-indexed.
+ cookies = []
+ for header in headers:
+ if header[0] == b'cookie':
+ cookies.append(header[1])
+ else:
+ yield header
+ if cookies:
+ cookie_val = b'; '.join(cookies)
+ yield NeverIndexedHeaderTuple(b'cookie', cookie_val)
+
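The cookie re-join above, minus HPACK's never-indexed wrapping, can be sketched as a plain function (`combine_cookie_fields` here is a standalone stand-in, not h2's generator):

```python
# Crumbs are gathered, everything else passes through, and one joined
# cookie field is emitted last -- matching the side-effect noted above.
def combine_cookie_fields(headers):
    cookies = []
    out = []
    for name, value in headers:
        if name == b'cookie':
            cookies.append(value)
        else:
            out.append((name, value))
    if cookies:
        out.append((b'cookie', b'; '.join(cookies)))
    return out
```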
+
+def normalize_outbound_headers(headers, hdr_validation_flags):
+ """
+ Normalizes a header sequence that we are about to send.
+
+ :param headers: The HTTP header set.
+ :param hdr_validation_flags: An instance of HeaderValidationFlags.
+ """
+ headers = _lowercase_header_names(headers, hdr_validation_flags)
+ headers = _strip_surrounding_whitespace(headers, hdr_validation_flags)
+ headers = _strip_connection_headers(headers, hdr_validation_flags)
+ headers = _secure_headers(headers, hdr_validation_flags)
+
+ return headers
+
+
+def normalize_inbound_headers(headers, hdr_validation_flags):
+ """
+ Normalizes a header sequence that we have received.
+
+ :param headers: The HTTP header set.
+ :param hdr_validation_flags: An instance of HeaderValidationFlags.
+ """
+ headers = _combine_cookie_fields(headers, hdr_validation_flags)
+ return headers
+
+
+def validate_outbound_headers(headers, hdr_validation_flags):
+ """
+ Validates and normalizes a header sequence that we are about to send.
+
+ :param headers: The HTTP header set.
+ :param hdr_validation_flags: An instance of HeaderValidationFlags.
+ """
+ headers = _reject_te(
+ headers, hdr_validation_flags
+ )
+ headers = _reject_connection_header(
+ headers, hdr_validation_flags
+ )
+ headers = _reject_pseudo_header_fields(
+ headers, hdr_validation_flags
+ )
+ headers = _check_sent_host_authority_header(
+ headers, hdr_validation_flags
+ )
+ headers = _check_path_header(headers, hdr_validation_flags)
+
+ return headers
+
+
+class SizeLimitDict(collections.OrderedDict):
+
+ def __init__(self, *args, **kwargs):
+ self._size_limit = kwargs.pop("size_limit", None)
+ super(SizeLimitDict, self).__init__(*args, **kwargs)
+
+ self._check_size_limit()
+
+ def __setitem__(self, key, value):
+ super(SizeLimitDict, self).__setitem__(key, value)
+
+ self._check_size_limit()
+
+ def _check_size_limit(self):
+ if self._size_limit is not None:
+ while len(self) > self._size_limit:
+ self.popitem(last=False)
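`SizeLimitDict` above is a small FIFO-evicting cache: once `size_limit` is exceeded, the oldest insertion is dropped via `popitem(last=False)`. A self-contained sketch of the same idea, with a short usage demo:

```python
import collections


class SizeLimitDict(collections.OrderedDict):
    """An OrderedDict that evicts its oldest entries beyond size_limit."""

    def __init__(self, *args, **kwargs):
        self._size_limit = kwargs.pop("size_limit", None)
        super().__init__(*args, **kwargs)
        self._check_size_limit()

    def __setitem__(self, key, value):
        super().__setitem__(key, value)
        self._check_size_limit()

    def _check_size_limit(self):
        if self._size_limit is not None:
            while len(self) > self._size_limit:
                self.popitem(last=False)  # drop the oldest insertion first


d = SizeLimitDict(size_limit=2)
d["a"] = 1
d["b"] = 2
d["c"] = 3          # "a" is evicted, FIFO order
print(list(d))      # ['b', 'c']
```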
diff --git a/.venv/lib/python3.9/site-packages/h2/windows.py b/.venv/lib/python3.9/site-packages/h2/windows.py
new file mode 100644
index 0000000..be4eb43
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/h2/windows.py
@@ -0,0 +1,139 @@
+# -*- coding: utf-8 -*-
+"""
+h2/windows
+~~~~~~~~~~
+
+Defines tools for managing HTTP/2 flow control windows.
+
+The objects defined in this module are used to automatically manage HTTP/2
+flow control windows. Specifically, they keep track of what the size of the
+window is, how much data has been consumed from that window, and how much data
+the user has already used. It then implements a basic algorithm that attempts
+to manage the flow control window without user input, trying to ensure that it
+does not emit too many WINDOW_UPDATE frames.
+"""
+from __future__ import division
+
+from .exceptions import FlowControlError
+
+
+# The largest acceptable value for a HTTP/2 flow control window.
+LARGEST_FLOW_CONTROL_WINDOW = 2**31 - 1
+
+
+class WindowManager:
+ """
+ A basic HTTP/2 window manager.
+
+ :param max_window_size: The maximum size of the flow control window.
+ :type max_window_size: ``int``
+ """
+ def __init__(self, max_window_size):
+ assert max_window_size <= LARGEST_FLOW_CONTROL_WINDOW
+ self.max_window_size = max_window_size
+ self.current_window_size = max_window_size
+ self._bytes_processed = 0
+
+ def window_consumed(self, size):
+ """
+ We have received a certain number of bytes from the remote peer. This
+ necessarily shrinks the flow control window!
+
+ :param size: The number of flow controlled bytes we received from the
+ remote peer.
+ :type size: ``int``
+ :returns: Nothing.
+ :rtype: ``None``
+ """
+ self.current_window_size -= size
+ if self.current_window_size < 0:
+ raise FlowControlError("Flow control window shrunk below 0")
+
+ def window_opened(self, size):
+ """
+ The flow control window has been incremented, either because of manual
+ flow control management or because of the user changing the flow
+ control settings. This can have the effect of increasing what we
+ consider to be the "maximum" flow control window size.
+
+ This does not increase our view of how many bytes have been processed,
+ only of how much space is in the window.
+
+ :param size: The increment to the flow control window we received.
+ :type size: ``int``
+ :returns: Nothing
+ :rtype: ``None``
+ """
+ self.current_window_size += size
+
+ if self.current_window_size > LARGEST_FLOW_CONTROL_WINDOW:
+ raise FlowControlError(
+ "Flow control window mustn't exceed %d" %
+ LARGEST_FLOW_CONTROL_WINDOW
+ )
+
+ if self.current_window_size > self.max_window_size:
+ self.max_window_size = self.current_window_size
+
+ def process_bytes(self, size):
+ """
+ The application has informed us that it has processed a certain number
+ of bytes. This may cause us to want to emit a window update frame. If
+ we do want to emit a window update frame, this method will return the
+ number of bytes that we should increment the window by.
+
+ :param size: The number of flow controlled bytes that the application
+ has processed.
+ :type size: ``int``
+ :returns: The number of bytes to increment the flow control window by,
+ or ``None``.
+ :rtype: ``int`` or ``None``
+ """
+ self._bytes_processed += size
+ return self._maybe_update_window()
+
+ def _maybe_update_window(self):
+ """
+ Run the algorithm.
+
+ Our current algorithm can be described like this.
+
+ 1. If no bytes have been processed, we immediately return ``None``.
+ There is no meaningful way for us to hand space in the window back
+ to the remote peer, so let's not even try.
+ 2. If there is no space in the flow control window, and we have
+ processed at least 1024 bytes (or 1/4 of the window, if the window
+ is smaller), we will emit a window update frame. This is to avoid
+ the risk of blocking a stream altogether.
+ 3. If there is space in the flow control window, and we have processed
+ at least 1/2 of the window worth of bytes, we will emit a window
+ update frame. This is to minimise the number of window update frames
+ we have to emit.
+
+ In a healthy system with large flow control windows, this will emit
+ WINDOW_UPDATE frames only occasionally. This prevents us from starving
+ the connection by emitting an excessive number of WINDOW_UPDATE frames,
+ especially in situations where the remote peer is sending a lot of very
+ small DATA frames.
+ """
+ # TODO: Can the window be smaller than 1024 bytes? If not, we can
+ # streamline this algorithm.
+ if not self._bytes_processed:
+ return None
+
+ max_increment = (self.max_window_size - self.current_window_size)
+ increment = 0
+
+ # Note that, even though we may increment less than _bytes_processed,
+ # we still want to set it to zero whenever we emit an increment. This
+ # is because we'll always increment up to the maximum we can.
+ if (self.current_window_size == 0) and (
+ self._bytes_processed > min(1024, self.max_window_size // 4)):
+ increment = min(self._bytes_processed, max_increment)
+ self._bytes_processed = 0
+ elif self._bytes_processed >= (self.max_window_size // 2):
+ increment = min(self._bytes_processed, max_increment)
+ self._bytes_processed = 0
+
+ self.current_window_size += increment
+ return increment
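The three-step heuristic described in `_maybe_update_window` can be exercised with a condensed standalone version of the class (a sketch of the same logic, not the vendored module). With the default 65535-byte window, processing a small amount of data emits no update, but crossing half the window does:

```python
class WindowManager:
    """Minimal sketch of the WINDOW_UPDATE heuristic described above."""

    LARGEST = 2 ** 31 - 1

    def __init__(self, max_window_size):
        assert max_window_size <= self.LARGEST
        self.max_window_size = max_window_size
        self.current_window_size = max_window_size
        self._bytes_processed = 0

    def window_consumed(self, size):
        # The peer sent us `size` flow-controlled bytes.
        self.current_window_size -= size

    def process_bytes(self, size):
        self._bytes_processed += size
        if not self._bytes_processed:
            return None
        max_increment = self.max_window_size - self.current_window_size
        increment = 0
        # Window empty and enough processed: update to avoid stalling.
        if self.current_window_size == 0 and \
                self._bytes_processed > min(1024, self.max_window_size // 4):
            increment = min(self._bytes_processed, max_increment)
            self._bytes_processed = 0
        # Half the window processed: update to batch increments.
        elif self._bytes_processed >= self.max_window_size // 2:
            increment = min(self._bytes_processed, max_increment)
            self._bytes_processed = 0
        self.current_window_size += increment
        return increment


wm = WindowManager(65535)
wm.window_consumed(10_000)
print(wm.process_bytes(10_000))   # 0: under half the window, no update yet
wm.window_consumed(30_000)
print(wm.process_bytes(30_000))   # 40000: past half the window, emit an update
```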
diff --git a/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/INSTALLER b/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/INSTALLER
new file mode 100644
index 0000000..a1b589e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/INSTALLER
@@ -0,0 +1 @@
+pip
diff --git a/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/LICENSE b/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/LICENSE
new file mode 100644
index 0000000..d24c351
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/LICENSE
@@ -0,0 +1,21 @@
+The MIT License (MIT)
+
+Copyright (c) 2014 Cory Benfield
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+THE SOFTWARE.
diff --git a/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/METADATA b/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/METADATA
new file mode 100644
index 0000000..bda8b9e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/METADATA
@@ -0,0 +1,69 @@
+Metadata-Version: 2.1
+Name: hpack
+Version: 4.0.0
+Summary: Pure-Python HPACK header compression
+Home-page: https://github.com/python-hyper/hpack
+Author: Cory Benfield
+Author-email: cory@lukasa.co.uk
+License: MIT License
+Platform: UNKNOWN
+Classifier: Development Status :: 5 - Production/Stable
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Programming Language :: Python
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.6
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: Implementation :: CPython
+Classifier: Programming Language :: Python :: Implementation :: PyPy
+Requires-Python: >=3.6.1
+Description-Content-Type: text/x-rst
+
+========================================
+hpack: HTTP/2 Header Encoding for Python
+========================================
+
+.. image:: https://github.com/python-hyper/hpack/workflows/CI/badge.svg
+ :target: https://github.com/python-hyper/hpack/actions
+ :alt: Build Status
+.. image:: https://codecov.io/gh/python-hyper/hpack/branch/master/graph/badge.svg
+ :target: https://codecov.io/gh/python-hyper/hpack
+ :alt: Code Coverage
+.. image:: https://readthedocs.org/projects/hpack/badge/?version=latest
+ :target: https://hpack.readthedocs.io/en/latest/
+ :alt: Documentation Status
+.. image:: https://img.shields.io/badge/chat-join_now-brightgreen.svg
+ :target: https://gitter.im/python-hyper/community
+ :alt: Chat community
+
+.. image:: https://raw.github.com/Lukasa/hyper/development/docs/source/images/hyper.png
+
+This module contains pure-Python HTTP/2 header encoding (HPACK) logic for use
+in Python programs that implement HTTP/2.
+
+Contributing
+============
+
+``hpack`` welcomes contributions from anyone! Unlike many other projects we are
+happy to accept cosmetic contributions and small contributions, in addition to
+large feature requests and changes.
+
+Before you contribute (either by opening an issue or filing a pull request),
+please `read the contribution guidelines`_.
+
+.. _read the contribution guidelines: http://hyper.readthedocs.org/en/development/contributing.html
+
+License
+=======
+
+``hpack`` is made available under the MIT License. For more details, see the
+``LICENSE`` file in the repository.
+
+Authors
+=======
+
+``hpack`` is maintained by Cory Benfield, with contributions from others. For
+more details about the contributors, please see ``CONTRIBUTORS.rst``.
+
+
diff --git a/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/RECORD b/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/RECORD
new file mode 100644
index 0000000..526357b
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/RECORD
@@ -0,0 +1,22 @@
+hpack-4.0.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+hpack-4.0.0.dist-info/LICENSE,sha256=djqTQqBN9iBGydx0ilKHk06wpTMcaGOzygruIOGMtO0,1080
+hpack-4.0.0.dist-info/METADATA,sha256=qvjQZKJbV8sOvv6Xp_6VPcp_zuJMWUg81Q5XQYNoyEI,2533
+hpack-4.0.0.dist-info/RECORD,,
+hpack-4.0.0.dist-info/WHEEL,sha256=g4nMs7d-Xl9-xC9XovUrsDHGXt-FT0E17Yqo92DEfvY,92
+hpack-4.0.0.dist-info/top_level.txt,sha256=nyrZLbQo-0nC6ot3YO_109pkUiTtK1M0wUJfLHuQceE,6
+hpack/__init__.py,sha256=1v7RdDmqOTWOFJ4WulrMdwHNBfQNjP0XRlpw5KFya3U,568
+hpack/__pycache__/__init__.cpython-39.pyc,,
+hpack/__pycache__/exceptions.cpython-39.pyc,,
+hpack/__pycache__/hpack.cpython-39.pyc,,
+hpack/__pycache__/huffman.cpython-39.pyc,,
+hpack/__pycache__/huffman_constants.cpython-39.pyc,,
+hpack/__pycache__/huffman_table.cpython-39.pyc,,
+hpack/__pycache__/struct.cpython-39.pyc,,
+hpack/__pycache__/table.cpython-39.pyc,,
+hpack/exceptions.py,sha256=Rrw1Fke5Cfq72OwvwrnS6JucBvw3_bxu3o_ujtAUlrM,974
+hpack/hpack.py,sha256=qoOPcsqMdE6DzqIIoxs-UY-rvMzPn4AStqhFfbHzUBw,22683
+hpack/huffman.py,sha256=0efgsl0EjRoRCI0PuX8G9yK4P48dmOSD5pepMbkxSTY,2443
+hpack/huffman_constants.py,sha256=in1pbqU6HktjFwQ5gltrXZVdL3U4k_QstG9rIQgBWzM,4643
+hpack/huffman_table.py,sha256=5yijIuaylIePwkbhWMXKf3EXspeXKY7hakpvPdSRJkg,168580
+hpack/struct.py,sha256=gw7oxmlVzXyzVEV4Y6uvS3wnGrJ0zOljqH-xLpJqWZc,1050
+hpack/table.py,sha256=kC8SweTJIH3jB_qlEJj069jOkEYQGzQHhwELr-RnGkM,9635
diff --git a/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/WHEEL b/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/WHEEL
new file mode 100644
index 0000000..b552003
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/WHEEL
@@ -0,0 +1,5 @@
+Wheel-Version: 1.0
+Generator: bdist_wheel (0.34.2)
+Root-Is-Purelib: true
+Tag: py3-none-any
+
diff --git a/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/top_level.txt b/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/top_level.txt
new file mode 100644
index 0000000..1a0ac48
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack-4.0.0.dist-info/top_level.txt
@@ -0,0 +1 @@
+hpack
diff --git a/.venv/lib/python3.9/site-packages/hpack/__init__.py b/.venv/lib/python3.9/site-packages/hpack/__init__.py
new file mode 100644
index 0000000..fc26ac5
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack/__init__.py
@@ -0,0 +1,30 @@
+# -*- coding: utf-8 -*-
+"""
+hpack
+~~~~~
+
+HTTP/2 header encoding for Python.
+"""
+from .hpack import Encoder, Decoder
+from .struct import HeaderTuple, NeverIndexedHeaderTuple
+from .exceptions import (
+ HPACKError,
+ HPACKDecodingError,
+ InvalidTableIndex,
+ OversizedHeaderListError,
+ InvalidTableSizeError
+)
+
+__all__ = [
+ 'Encoder',
+ 'Decoder',
+ 'HeaderTuple',
+ 'NeverIndexedHeaderTuple',
+ 'HPACKError',
+ 'HPACKDecodingError',
+ 'InvalidTableIndex',
+ 'OversizedHeaderListError',
+ 'InvalidTableSizeError',
+]
+
+__version__ = '4.0.0'
diff --git a/.venv/lib/python3.9/site-packages/hpack/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hpack/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..a0f9f50
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hpack/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hpack/__pycache__/exceptions.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hpack/__pycache__/exceptions.cpython-39.pyc
new file mode 100644
index 0000000..c07a12d
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hpack/__pycache__/exceptions.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hpack/__pycache__/hpack.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hpack/__pycache__/hpack.cpython-39.pyc
new file mode 100644
index 0000000..119adbe
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hpack/__pycache__/hpack.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hpack/__pycache__/huffman.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hpack/__pycache__/huffman.cpython-39.pyc
new file mode 100644
index 0000000..763509d
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hpack/__pycache__/huffman.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hpack/__pycache__/huffman_constants.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hpack/__pycache__/huffman_constants.cpython-39.pyc
new file mode 100644
index 0000000..1d987e6
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hpack/__pycache__/huffman_constants.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hpack/__pycache__/huffman_table.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hpack/__pycache__/huffman_table.cpython-39.pyc
new file mode 100644
index 0000000..10ef7e7
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hpack/__pycache__/huffman_table.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hpack/__pycache__/struct.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hpack/__pycache__/struct.cpython-39.pyc
new file mode 100644
index 0000000..b714c9d
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hpack/__pycache__/struct.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hpack/__pycache__/table.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hpack/__pycache__/table.cpython-39.pyc
new file mode 100644
index 0000000..f0fc7a2
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hpack/__pycache__/table.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hpack/exceptions.py b/.venv/lib/python3.9/site-packages/hpack/exceptions.py
new file mode 100644
index 0000000..571ba98
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack/exceptions.py
@@ -0,0 +1,49 @@
+# -*- coding: utf-8 -*-
+"""
+hyper/http20/exceptions
+~~~~~~~~~~~~~~~~~~~~~~~
+
+This defines exceptions used in the HTTP/2 portion of hyper.
+"""
+
+
+class HPACKError(Exception):
+ """
+ The base class for all ``hpack`` exceptions.
+ """
+ pass
+
+
+class HPACKDecodingError(HPACKError):
+ """
+ An error has been encountered while performing HPACK decoding.
+ """
+ pass
+
+
+class InvalidTableIndex(HPACKDecodingError):
+ """
+ An invalid table index was received.
+ """
+ pass
+
+
+class OversizedHeaderListError(HPACKDecodingError):
+ """
+ A header list that was larger than we allow has been received. This may be
+ a DoS attack.
+
+ .. versionadded:: 2.3.0
+ """
+ pass
+
+
+class InvalidTableSizeError(HPACKDecodingError):
+ """
+ An attempt was made to change the decoder table size to a value larger than
+ allowed, or the list was shrunk and the remote peer didn't shrink their
+ table size.
+
+ .. versionadded:: 3.0.0
+ """
+ pass
diff --git a/.venv/lib/python3.9/site-packages/hpack/hpack.py b/.venv/lib/python3.9/site-packages/hpack/hpack.py
new file mode 100644
index 0000000..cc39bfd
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack/hpack.py
@@ -0,0 +1,633 @@
+# -*- coding: utf-8 -*-
+"""
+hpack/hpack
+~~~~~~~~~~~
+
+Implements the HPACK header compression algorithm as detailed by the IETF.
+"""
+import logging
+
+from .table import HeaderTable, table_entry_size
+from .exceptions import (
+ HPACKDecodingError, OversizedHeaderListError, InvalidTableSizeError
+)
+from .huffman import HuffmanEncoder
+from .huffman_constants import (
+ REQUEST_CODES, REQUEST_CODES_LENGTH
+)
+from .huffman_table import decode_huffman
+from .struct import HeaderTuple, NeverIndexedHeaderTuple
+
+log = logging.getLogger(__name__)
+
+INDEX_NONE = b'\x00'
+INDEX_NEVER = b'\x10'
+INDEX_INCREMENTAL = b'\x40'
+
+# Precompute 2^i for 1-8 for use in prefix calcs.
+# Zero index is not used but there to save a subtraction
+# as prefix numbers are not zero indexed.
+_PREFIX_BIT_MAX_NUMBERS = [(2 ** i) - 1 for i in range(9)]
+
+try: # pragma: no cover
+ basestring = basestring
+except NameError: # pragma: no cover
+ basestring = (str, bytes)
+
+
+# We default the maximum header list we're willing to accept to 64kB. That's a
+# lot of headers, but if applications want to raise it they can do so.
+DEFAULT_MAX_HEADER_LIST_SIZE = 2 ** 16
+
+
+def _unicode_if_needed(header, raw):
+ """
+ Provides a header as a unicode string if raw is False, otherwise returns
+ it as a bytestring.
+ """
+ name = bytes(header[0])
+ value = bytes(header[1])
+ if not raw:
+ name = name.decode('utf-8')
+ value = value.decode('utf-8')
+ return header.__class__(name, value)
+
+
+def encode_integer(integer, prefix_bits):
+ """
+ This encodes an integer according to the wacky integer encoding rules
+ defined in the HPACK spec.
+ """
+ log.debug("Encoding %d with %d bits", integer, prefix_bits)
+
+ if integer < 0:
+ raise ValueError(
+ "Can only encode positive integers, got %s" % integer
+ )
+
+ if prefix_bits < 1 or prefix_bits > 8:
+ raise ValueError(
+ "Prefix bits must be between 1 and 8, got %s" % prefix_bits
+ )
+
+ max_number = _PREFIX_BIT_MAX_NUMBERS[prefix_bits]
+
+ if integer < max_number:
+ return bytearray([integer]) # Seriously?
+ else:
+ elements = [max_number]
+ integer -= max_number
+
+ while integer >= 128:
+ elements.append((integer & 127) + 128)
+ integer >>= 7
+
+ elements.append(integer)
+
+ return bytearray(elements)
+
+
+def decode_integer(data, prefix_bits):
+ """
+ This decodes an integer according to the wacky integer encoding rules
+ defined in the HPACK spec. Returns a tuple of the decoded integer and the
+ number of bytes that were consumed from ``data`` in order to get that
+ integer.
+ """
+ if prefix_bits < 1 or prefix_bits > 8:
+ raise ValueError(
+ "Prefix bits must be between 1 and 8, got %s" % prefix_bits
+ )
+
+ max_number = _PREFIX_BIT_MAX_NUMBERS[prefix_bits]
+ index = 1
+ shift = 0
+ mask = (0xFF >> (8 - prefix_bits))
+
+ try:
+ number = data[0] & mask
+ if number == max_number:
+ while True:
+ next_byte = data[index]
+ index += 1
+
+ if next_byte >= 128:
+ number += (next_byte - 128) << shift
+ else:
+ number += next_byte << shift
+ break
+ shift += 7
+
+ except IndexError:
+ raise HPACKDecodingError(
+ "Unable to decode HPACK integer representation from %r" % data
+ )
+
+ log.debug("Decoded %d, consumed %d bytes", number, index)
+
+ return number, index
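The prefix-integer coding that `encode_integer` and `decode_integer` implement is specified in RFC 7541 § 5.1: values below the prefix maximum fit in one byte, larger values spill into 7-bit continuation bytes. A trimmed standalone sketch (without the error handling above), checked against the RFC's worked example of 1337 with a 5-bit prefix:

```python
def encode_integer(integer, prefix_bits):
    """Encode an integer with an N-bit prefix (RFC 7541 § 5.1)."""
    max_number = (1 << prefix_bits) - 1
    if integer < max_number:
        return bytearray([integer])          # fits entirely in the prefix
    elements = [max_number]                  # prefix saturated
    integer -= max_number
    while integer >= 128:
        elements.append((integer & 127) + 128)  # 7 bits + continuation flag
        integer >>= 7
    elements.append(integer)
    return bytearray(elements)


def decode_integer(data, prefix_bits):
    """Decode a prefix integer; returns (value, bytes consumed)."""
    max_number = (1 << prefix_bits) - 1
    number = data[0] & (0xFF >> (8 - prefix_bits))
    index = 1
    if number == max_number:
        shift = 0
        while True:
            next_byte = data[index]
            index += 1
            number += (next_byte & 0x7F) << shift
            if next_byte < 128:              # no continuation flag: done
                break
            shift += 7
    return number, index


print(bytes(encode_integer(1337, 5)).hex())  # 1f9a0a, as in RFC 7541 § C.1
print(decode_integer(b'\x1f\x9a\x0a', 5))    # (1337, 3)
```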
+
+
+def _dict_to_iterable(header_dict):
+ """
+ This converts a dictionary to an iterable of two-tuples. This is a
+ HPACK-specific function because it pulls "special-headers" out first and
+ then emits them.
+ """
+ assert isinstance(header_dict, dict)
+ keys = sorted(
+ header_dict.keys(),
+ key=lambda k: not _to_bytes(k).startswith(b':')
+ )
+ for key in keys:
+ yield key, header_dict[key]
+
+
+def _to_bytes(string):
+ """
+ Convert string to bytes.
+ """
+ if not isinstance(string, basestring): # pragma: no cover
+ string = str(string)
+
+ return string if isinstance(string, bytes) else string.encode('utf-8')
+
+
+class Encoder:
+ """
+ An HPACK encoder object. This object takes HTTP headers and emits encoded
+ HTTP/2 header blocks.
+ """
+
+ def __init__(self):
+ self.header_table = HeaderTable()
+ self.huffman_coder = HuffmanEncoder(
+ REQUEST_CODES, REQUEST_CODES_LENGTH
+ )
+ self.table_size_changes = []
+
+ @property
+ def header_table_size(self):
+ """
+ Controls the size of the HPACK header table.
+ """
+ return self.header_table.maxsize
+
+ @header_table_size.setter
+ def header_table_size(self, value):
+ self.header_table.maxsize = value
+ if self.header_table.resized:
+ self.table_size_changes.append(value)
+
+ def encode(self, headers, huffman=True):
+ """
+ Takes a set of headers and encodes them into a HPACK-encoded header
+ block.
+
+ :param headers: The headers to encode. Must be either an iterable of
+ tuples, an iterable of :class:`HeaderTuple
+ <hpack.HeaderTuple>`, or a ``dict``.
+
+ If an iterable of tuples, the tuples may be either
+ two-tuples or three-tuples. If they are two-tuples, the
+ tuples must be of the format ``(name, value)``. If they
+ are three-tuples, they must be of the format
+ ``(name, value, sensitive)``, where ``sensitive`` is a
+ boolean value indicating whether the header should be
+ added to header tables anywhere. If not present,
+ ``sensitive`` defaults to ``False``.
+
+ If an iterable of :class:`HeaderTuple
+ <hpack.HeaderTuple>`, the tuples must always be
+ two-tuples. Instead of using ``sensitive`` as a third
+ tuple entry, use :class:`NeverIndexedHeaderTuple
+ <hpack.NeverIndexedHeaderTuple>` to request that
+ the field never be indexed.
+
+ .. warning:: HTTP/2 requires that all special headers
+ (headers whose names begin with ``:`` characters)
+ appear at the *start* of the header block. While
+ this method will ensure that happens for ``dict``
+ subclasses, callers using any other iterable of
+ tuples **must** ensure they place their special
+ headers at the start of the iterable.
+
+ For efficiency reasons users should prefer to use
+ iterables of two-tuples: fixing the ordering of
+ dictionary headers is an expensive operation that
+ should be avoided if possible.
+
+ :param huffman: (optional) Whether to Huffman-encode any header sent as
+ a literal value. Except for use when debugging, it is
+ recommended that this be left enabled.
+
+ :returns: A bytestring containing the HPACK-encoded header block.
+ """
+ # Transforming the headers into a header block is a procedure that can
+ # be modeled as a chain or pipe. First, the headers are encoded. This
+ # encoding can be done a number of ways. If the header name-value pair
+ # are already in the header table we can represent them using the
+ # indexed representation: the same is true if they are in the static
+ # table. Otherwise, a literal representation will be used.
+ header_block = []
+
+ # Turn the headers into a list of tuples if possible. This is the
+ # natural way to interact with them in HPACK. Because dictionaries are
+ # un-ordered, we need to make sure we grab the "special" headers first.
+ if isinstance(headers, dict):
+ headers = _dict_to_iterable(headers)
+
+ # Before we begin, if the header table size has been changed we need
+ # to signal all changes since last emission appropriately.
+ if self.header_table.resized:
+ header_block.append(self._encode_table_size_change())
+ self.header_table.resized = False
+
+ # Add each header to the header block
+ for header in headers:
+ sensitive = False
+ if isinstance(header, HeaderTuple):
+ sensitive = not header.indexable
+ elif len(header) > 2:
+ sensitive = header[2]
+
+ header = (_to_bytes(header[0]), _to_bytes(header[1]))
+ header_block.append(self.add(header, sensitive, huffman))
+
+ header_block = b''.join(header_block)
+
+ log.debug("Encoded header block to %s", header_block)
+
+ return header_block
+
+ def add(self, to_add, sensitive, huffman=False):
+ """
+ This function takes a header key-value tuple and serializes it.
+ """
+ log.debug(
+ "Adding %s to the header table, sensitive:%s, huffman:%s",
+ to_add,
+ sensitive,
+ huffman
+ )
+
+ name, value = to_add
+
+ # Set our indexing mode
+ indexbit = INDEX_INCREMENTAL if not sensitive else INDEX_NEVER
+
+ # Search for a matching header in the header table.
+ match = self.header_table.search(name, value)
+
+ if match is None:
+ # Not in the header table. Encode using the literal syntax,
+ # and add it to the header table.
+ encoded = self._encode_literal(name, value, indexbit, huffman)
+ if not sensitive:
+ self.header_table.add(name, value)
+ return encoded
+
+ # The header is in the table, break out the values. If we matched
+ # perfectly, we can use the indexed representation: otherwise we
+ # can use the indexed literal.
+ index, name, perfect = match
+
+ if perfect:
+ # Indexed representation.
+ encoded = self._encode_indexed(index)
+ else:
+ # Indexed literal. We are going to add the header to the
+ # header table unconditionally. It is a future todo to
+ # filter out headers which are known to be ineffective for
+ # indexing, since they just take space in the table and
+ # push out other valuable headers.
+ encoded = self._encode_indexed_literal(
+ index, value, indexbit, huffman
+ )
+ if not sensitive:
+ self.header_table.add(name, value)
+
+ return encoded
+
+ def _encode_indexed(self, index):
+ """
+ Encodes a header using the indexed representation.
+ """
+ field = encode_integer(index, 7)
+ field[0] |= 0x80 # we set the top bit
+ return bytes(field)
+
+ def _encode_literal(self, name, value, indexbit, huffman=False):
+ """
+ Encodes a header with a literal name and literal value. If ``indexing``
+ is True, the header will be added to the header table: otherwise it
+ will not.
+ """
+ if huffman:
+ name = self.huffman_coder.encode(name)
+ value = self.huffman_coder.encode(value)
+
+ name_len = encode_integer(len(name), 7)
+ value_len = encode_integer(len(value), 7)
+
+ if huffman:
+ name_len[0] |= 0x80
+ value_len[0] |= 0x80
+
+ return b''.join(
+ [indexbit, bytes(name_len), name, bytes(value_len), value]
+ )
+
+ def _encode_indexed_literal(self, index, value, indexbit, huffman=False):
+ """
+ Encodes a header with an indexed name and a literal value and performs
+ incremental indexing.
+ """
+ if indexbit != INDEX_INCREMENTAL:
+ prefix = encode_integer(index, 4)
+ else:
+ prefix = encode_integer(index, 6)
+
+ prefix[0] |= ord(indexbit)
+
+ if huffman:
+ value = self.huffman_coder.encode(value)
+
+ value_len = encode_integer(len(value), 7)
+
+ if huffman:
+ value_len[0] |= 0x80
+
+ return b''.join([bytes(prefix), bytes(value_len), value])
+
+ def _encode_table_size_change(self):
+ """
+ Produces the encoded form of all header table size change context
+ updates.
+ """
+ block = b''
+ for size_bytes in self.table_size_changes:
+ size_bytes = encode_integer(size_bytes, 5)
+ size_bytes[0] |= 0x20
+ block += bytes(size_bytes)
+ self.table_size_changes = []
+ return block
+
+
+class Decoder:
+ """
+ An HPACK decoder object.
+
+ .. versionchanged:: 2.3.0
+ Added ``max_header_list_size`` argument.
+
+ :param max_header_list_size: The maximum decompressed size we will allow
+ for any single header block. This is a protection against DoS attacks
+ that attempt to force the application to expand a relatively small
+ amount of data into a really large header list, allowing enormous
+ amounts of memory to be allocated.
+
+ If this amount of data is exceeded, an `OversizedHeaderListError
+ <hpack.OversizedHeaderListError>` exception will be raised. At this
+ point the connection should be shut down, as the HPACK state will no
+ longer be usable.
+
+ Defaults to 64kB.
+ :type max_header_list_size: ``int``
+ """
+ def __init__(self, max_header_list_size=DEFAULT_MAX_HEADER_LIST_SIZE):
+ self.header_table = HeaderTable()
+
+ #: The maximum decompressed size we will allow for any single header
+ #: block. This is a protection against DoS attacks that attempt to
+ #: force the application to expand a relatively small amount of data
+ #: into a really large header list, allowing enormous amounts of memory
+ #: to be allocated.
+ #:
+ #: If this amount of data is exceeded, an `OversizedHeaderListError
+ #: <hpack.OversizedHeaderListError>` exception will be raised. At this
+ #: point the connection should be shut down, as the HPACK state will no
+ #: longer be usable.
+ #:
+ #: Defaults to 64kB.
+ #:
+ #: .. versionadded:: 2.3.0
+ self.max_header_list_size = max_header_list_size
+
+ #: Maximum allowed header table size.
+ #:
+ #: A HTTP/2 implementation should set this to the most recent value of
+ #: SETTINGS_HEADER_TABLE_SIZE that it sent *and has received an ACK
+ #: for*. Once this setting is set, the actual header table size will be
+ #: checked at the end of each decoding run and whenever it is changed,
+ #: to confirm that it fits in this size.
+ self.max_allowed_table_size = self.header_table.maxsize
+
+ @property
+ def header_table_size(self):
+ """
+ Controls the size of the HPACK header table.
+ """
+ return self.header_table.maxsize
+
+ @header_table_size.setter
+ def header_table_size(self, value):
+ self.header_table.maxsize = value
+
+ def decode(self, data, raw=False):
+ """
+ Takes an HPACK-encoded header block and decodes it into a header set.
+
+ :param data: A bytestring representing a complete HPACK-encoded header
+ block.
+ :param raw: (optional) Whether to return the headers as tuples of raw
+ byte strings or to decode them as UTF-8 before returning
+ them. The default value is False, which returns tuples of
+                   Unicode strings.
+ :returns: A list of two-tuples of ``(name, value)`` representing the
+ HPACK-encoded headers, in the order they were decoded.
+ :raises HPACKDecodingError: If an error is encountered while decoding
+ the header block.
+ """
+ log.debug("Decoding %s", data)
+
+ data_mem = memoryview(data)
+ headers = []
+ data_len = len(data)
+ inflated_size = 0
+ current_index = 0
+
+ while current_index < data_len:
+ # Work out what kind of header we're decoding.
+ # If the high bit is 1, it's an indexed field.
+ current = data[current_index]
+ indexed = True if current & 0x80 else False
+
+ # Otherwise, if the second-highest bit is 1 it's a field that does
+ # alter the header table.
+ literal_index = bool(current & 0x40)
+
+ # Otherwise, if the third-highest bit is 1 it's an encoding context
+ # update.
+ encoding_update = bool(current & 0x20)
+
+ if indexed:
+ header, consumed = self._decode_indexed(
+ data_mem[current_index:]
+ )
+ elif literal_index:
+ # It's a literal header that does affect the header table.
+ header, consumed = self._decode_literal_index(
+ data_mem[current_index:]
+ )
+ elif encoding_update:
+ # It's an update to the encoding context. These are forbidden
+ # in a header block after any actual header.
+ if headers:
+ raise HPACKDecodingError(
+ "Table size update not at the start of the block"
+ )
+ consumed = self._update_encoding_context(
+ data_mem[current_index:]
+ )
+ header = None
+ else:
+ # It's a literal header that does not affect the header table.
+ header, consumed = self._decode_literal_no_index(
+ data_mem[current_index:]
+ )
+
+ if header:
+ headers.append(header)
+ inflated_size += table_entry_size(*header)
+
+ if inflated_size > self.max_header_list_size:
+ raise OversizedHeaderListError(
+ "A header list larger than %d has been received" %
+ self.max_header_list_size
+ )
+
+ current_index += consumed
+
+ # Confirm that the table size is lower than the maximum. We do this
+ # here to ensure that we catch when the max has been *shrunk* and the
+ # remote peer hasn't actually done that.
+ self._assert_valid_table_size()
+
+ try:
+ return [_unicode_if_needed(h, raw) for h in headers]
+ except UnicodeDecodeError:
+ raise HPACKDecodingError("Unable to decode headers as UTF-8.")
+
+ def _assert_valid_table_size(self):
+ """
+ Check that the table size set by the encoder is lower than the maximum
+ we expect to have.
+ """
+ if self.header_table_size > self.max_allowed_table_size:
+ raise InvalidTableSizeError(
+ "Encoder did not shrink table size to within the max"
+ )
+
+ def _update_encoding_context(self, data):
+ """
+ Handles a byte that updates the encoding context.
+ """
+ # We've been asked to resize the header table.
+ new_size, consumed = decode_integer(data, 5)
+ if new_size > self.max_allowed_table_size:
+ raise InvalidTableSizeError(
+ "Encoder exceeded max allowable table size"
+ )
+ self.header_table_size = new_size
+ return consumed
+
+ def _decode_indexed(self, data):
+ """
+ Decodes a header represented using the indexed representation.
+ """
+ index, consumed = decode_integer(data, 7)
+ header = HeaderTuple(*self.header_table.get_by_index(index))
+ log.debug("Decoded %s, consumed %d", header, consumed)
+ return header, consumed
+
+ def _decode_literal_no_index(self, data):
+ return self._decode_literal(data, False)
+
+ def _decode_literal_index(self, data):
+ return self._decode_literal(data, True)
+
+ def _decode_literal(self, data, should_index):
+ """
+ Decodes a header represented with a literal.
+ """
+ total_consumed = 0
+
+ # When should_index is true, if the low six bits of the first byte are
+ # nonzero, the header name is indexed.
+ # When should_index is false, if the low four bits of the first byte
+ # are nonzero the header name is indexed.
+ if should_index:
+ indexed_name = data[0] & 0x3F
+ name_len = 6
+ not_indexable = False
+ else:
+ high_byte = data[0]
+ indexed_name = high_byte & 0x0F
+ name_len = 4
+ not_indexable = high_byte & 0x10
+
+ if indexed_name:
+ # Indexed header name.
+ index, consumed = decode_integer(data, name_len)
+ name = self.header_table.get_by_index(index)[0]
+
+ total_consumed = consumed
+ length = 0
+ else:
+ # Literal header name. The first byte was consumed, so we need to
+ # move forward.
+ data = data[1:]
+
+ length, consumed = decode_integer(data, 7)
+ name = data[consumed:consumed + length]
+ if len(name) != length:
+ raise HPACKDecodingError("Truncated header block")
+
+ if data[0] & 0x80:
+ name = decode_huffman(name)
+ total_consumed = consumed + length + 1 # Since we moved forward 1.
+
+ data = data[consumed + length:]
+
+ # The header value is definitely length-based.
+ length, consumed = decode_integer(data, 7)
+ value = data[consumed:consumed + length]
+ if len(value) != length:
+ raise HPACKDecodingError("Truncated header block")
+
+ if data[0] & 0x80:
+ value = decode_huffman(value)
+
+ # Update the total consumed length.
+ total_consumed += length + consumed
+
+ # If we have been told never to index the header field, encode that in
+ # the tuple we use.
+ if not_indexable:
+ header = NeverIndexedHeaderTuple(name, value)
+ else:
+ header = HeaderTuple(name, value)
+
+ # If we've been asked to index this, add it to the header table.
+ if should_index:
+ self.header_table.add(name, value)
+
+ log.debug(
+ "Decoded %s, total consumed %d bytes, indexed %s",
+ header,
+ total_consumed,
+ should_index
+ )
+
+ return header, total_consumed
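Every decode path above calls `decode_integer` and relies on its `(value, consumed)` contract. As a self-contained illustration (not the vendored implementation itself), here is a minimal sketch of HPACK's prefixed-integer scheme from RFC 7541, §5.1:

```python
def decode_integer(data, prefix_bits):
    """Decode an HPACK prefixed integer; return (value, bytes_consumed)."""
    max_prefix = (1 << prefix_bits) - 1  # an all-ones prefix means "continued"
    value = data[0] & max_prefix
    consumed = 1
    if value == max_prefix:
        shift = 0
        while True:
            byte = data[consumed]
            consumed += 1
            value += (byte & 0x7F) << shift  # low 7 bits carry the payload
            shift += 7
            if not byte & 0x80:  # high bit clear: last continuation byte
                break
    return value, consumed

# RFC 7541 worked example: 1337 encoded with a 5-bit prefix.
print(decode_integer(bytes([0x1F, 0x9A, 0x0A]), 5))  # → (1337, 3)
```

The continuation loop is why the decoder must track `consumed` separately from the value: a small integer fits in the prefix byte, while a large one spills into little-endian 7-bit groups.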
diff --git a/.venv/lib/python3.9/site-packages/hpack/huffman.py b/.venv/lib/python3.9/site-packages/hpack/huffman.py
new file mode 100644
index 0000000..595d69b
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack/huffman.py
@@ -0,0 +1,66 @@
+# -*- coding: utf-8 -*-
+"""
+hpack/huffman
+~~~~~~~~~~~~~
+
+An implementation of the Huffman encoder defined in the HPACK
+specification: encodes byte strings using the fixed HPACK Huffman table.
+"""
+
+
+class HuffmanEncoder:
+ """
+ Encodes a string according to the Huffman encoding table defined in the
+ HPACK specification.
+ """
+ def __init__(self, huffman_code_list, huffman_code_list_lengths):
+ self.huffman_code_list = huffman_code_list
+ self.huffman_code_list_lengths = huffman_code_list_lengths
+
+ def encode(self, bytes_to_encode):
+ """
+ Given a string of bytes, encodes them according to the HPACK Huffman
+ specification.
+ """
+ # If handed the empty string, just immediately return.
+ if not bytes_to_encode:
+ return b''
+
+ final_num = 0
+ final_int_len = 0
+
+ # Turn each byte into its huffman code. These codes aren't necessarily
+ # octet aligned, so keep track of how far through an octet we are. To
+ # handle this cleanly, just use a single giant integer.
+ for byte in bytes_to_encode:
+ bin_int_len = self.huffman_code_list_lengths[byte]
+ bin_int = self.huffman_code_list[byte] & (
+ 2 ** (bin_int_len + 1) - 1
+ )
+ final_num <<= bin_int_len
+ final_num |= bin_int
+ final_int_len += bin_int_len
+
+ # Pad out to an octet with ones.
+ bits_to_be_padded = (8 - (final_int_len % 8)) % 8
+ final_num <<= bits_to_be_padded
+ final_num |= (1 << bits_to_be_padded) - 1
+
+ # Convert the number to hex and strip off the leading '0x' and the
+ # trailing 'L', if present.
+ final_num = hex(final_num)[2:].rstrip('L')
+
+ # If this is odd, prepend a zero.
+ final_num = '0' + final_num if len(final_num) % 2 != 0 else final_num
+
+ # This number should have twice as many digits as bytes. If not, we're
+ # missing some leading zeroes. Work out how many bytes we want and how
+ # many digits we have, then add the missing zero digits to the front.
+ total_bytes = (final_int_len + bits_to_be_padded) // 8
+ expected_digits = total_bytes * 2
+
+ if len(final_num) != expected_digits:
+ missing_digits = expected_digits - len(final_num)
+ final_num = ('0' * missing_digits) + final_num
+
+ return bytes.fromhex(final_num)
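The encode path above (shift each symbol's code into one big integer, then pad the final partial octet with ones) can be seen in isolation with a toy code table. The two codes below are hypothetical stand-ins, not the real HPACK codes:

```python
# Hypothetical variable-length codes: byte -> (code bits, bit length).
TOY_CODES = {ord('a'): (0b00011, 5), ord('b'): (0b100011, 6)}

def toy_huffman_pack(data):
    num, nbits = 0, 0
    for byte in data:
        code, length = TOY_CODES[byte]
        num = (num << length) | code       # append this symbol's bits
        nbits += length
    pad = (8 - nbits % 8) % 8              # bits needed to reach an octet
    num = (num << pad) | ((1 << pad) - 1)  # pad with ones, as HPACK does
    return num.to_bytes((nbits + pad) // 8, 'big')

print(toy_huffman_pack(b'ab').hex())  # → '1c7f'
```

Using `int.to_bytes` sidesteps the hex-string round-trip the vendored code performs, which exists mainly for Python 2 compatibility (note its `.rstrip('L')`).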
diff --git a/.venv/lib/python3.9/site-packages/hpack/huffman_constants.py b/.venv/lib/python3.9/site-packages/hpack/huffman_constants.py
new file mode 100644
index 0000000..4caf012
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack/huffman_constants.py
@@ -0,0 +1,289 @@
+# -*- coding: utf-8 -*-
+"""
+hpack/huffman_constants
+~~~~~~~~~~~~~~~~~~~~~~~
+
+Defines the constant Huffman table. This takes up an upsetting amount of space,
+but c'est la vie.
+"""
+# flake8: noqa
+
+REQUEST_CODES = [
+ 0x1ff8,
+ 0x7fffd8,
+ 0xfffffe2,
+ 0xfffffe3,
+ 0xfffffe4,
+ 0xfffffe5,
+ 0xfffffe6,
+ 0xfffffe7,
+ 0xfffffe8,
+ 0xffffea,
+ 0x3ffffffc,
+ 0xfffffe9,
+ 0xfffffea,
+ 0x3ffffffd,
+ 0xfffffeb,
+ 0xfffffec,
+ 0xfffffed,
+ 0xfffffee,
+ 0xfffffef,
+ 0xffffff0,
+ 0xffffff1,
+ 0xffffff2,
+ 0x3ffffffe,
+ 0xffffff3,
+ 0xffffff4,
+ 0xffffff5,
+ 0xffffff6,
+ 0xffffff7,
+ 0xffffff8,
+ 0xffffff9,
+ 0xffffffa,
+ 0xffffffb,
+ 0x14,
+ 0x3f8,
+ 0x3f9,
+ 0xffa,
+ 0x1ff9,
+ 0x15,
+ 0xf8,
+ 0x7fa,
+ 0x3fa,
+ 0x3fb,
+ 0xf9,
+ 0x7fb,
+ 0xfa,
+ 0x16,
+ 0x17,
+ 0x18,
+ 0x0,
+ 0x1,
+ 0x2,
+ 0x19,
+ 0x1a,
+ 0x1b,
+ 0x1c,
+ 0x1d,
+ 0x1e,
+ 0x1f,
+ 0x5c,
+ 0xfb,
+ 0x7ffc,
+ 0x20,
+ 0xffb,
+ 0x3fc,
+ 0x1ffa,
+ 0x21,
+ 0x5d,
+ 0x5e,
+ 0x5f,
+ 0x60,
+ 0x61,
+ 0x62,
+ 0x63,
+ 0x64,
+ 0x65,
+ 0x66,
+ 0x67,
+ 0x68,
+ 0x69,
+ 0x6a,
+ 0x6b,
+ 0x6c,
+ 0x6d,
+ 0x6e,
+ 0x6f,
+ 0x70,
+ 0x71,
+ 0x72,
+ 0xfc,
+ 0x73,
+ 0xfd,
+ 0x1ffb,
+ 0x7fff0,
+ 0x1ffc,
+ 0x3ffc,
+ 0x22,
+ 0x7ffd,
+ 0x3,
+ 0x23,
+ 0x4,
+ 0x24,
+ 0x5,
+ 0x25,
+ 0x26,
+ 0x27,
+ 0x6,
+ 0x74,
+ 0x75,
+ 0x28,
+ 0x29,
+ 0x2a,
+ 0x7,
+ 0x2b,
+ 0x76,
+ 0x2c,
+ 0x8,
+ 0x9,
+ 0x2d,
+ 0x77,
+ 0x78,
+ 0x79,
+ 0x7a,
+ 0x7b,
+ 0x7ffe,
+ 0x7fc,
+ 0x3ffd,
+ 0x1ffd,
+ 0xffffffc,
+ 0xfffe6,
+ 0x3fffd2,
+ 0xfffe7,
+ 0xfffe8,
+ 0x3fffd3,
+ 0x3fffd4,
+ 0x3fffd5,
+ 0x7fffd9,
+ 0x3fffd6,
+ 0x7fffda,
+ 0x7fffdb,
+ 0x7fffdc,
+ 0x7fffdd,
+ 0x7fffde,
+ 0xffffeb,
+ 0x7fffdf,
+ 0xffffec,
+ 0xffffed,
+ 0x3fffd7,
+ 0x7fffe0,
+ 0xffffee,
+ 0x7fffe1,
+ 0x7fffe2,
+ 0x7fffe3,
+ 0x7fffe4,
+ 0x1fffdc,
+ 0x3fffd8,
+ 0x7fffe5,
+ 0x3fffd9,
+ 0x7fffe6,
+ 0x7fffe7,
+ 0xffffef,
+ 0x3fffda,
+ 0x1fffdd,
+ 0xfffe9,
+ 0x3fffdb,
+ 0x3fffdc,
+ 0x7fffe8,
+ 0x7fffe9,
+ 0x1fffde,
+ 0x7fffea,
+ 0x3fffdd,
+ 0x3fffde,
+ 0xfffff0,
+ 0x1fffdf,
+ 0x3fffdf,
+ 0x7fffeb,
+ 0x7fffec,
+ 0x1fffe0,
+ 0x1fffe1,
+ 0x3fffe0,
+ 0x1fffe2,
+ 0x7fffed,
+ 0x3fffe1,
+ 0x7fffee,
+ 0x7fffef,
+ 0xfffea,
+ 0x3fffe2,
+ 0x3fffe3,
+ 0x3fffe4,
+ 0x7ffff0,
+ 0x3fffe5,
+ 0x3fffe6,
+ 0x7ffff1,
+ 0x3ffffe0,
+ 0x3ffffe1,
+ 0xfffeb,
+ 0x7fff1,
+ 0x3fffe7,
+ 0x7ffff2,
+ 0x3fffe8,
+ 0x1ffffec,
+ 0x3ffffe2,
+ 0x3ffffe3,
+ 0x3ffffe4,
+ 0x7ffffde,
+ 0x7ffffdf,
+ 0x3ffffe5,
+ 0xfffff1,
+ 0x1ffffed,
+ 0x7fff2,
+ 0x1fffe3,
+ 0x3ffffe6,
+ 0x7ffffe0,
+ 0x7ffffe1,
+ 0x3ffffe7,
+ 0x7ffffe2,
+ 0xfffff2,
+ 0x1fffe4,
+ 0x1fffe5,
+ 0x3ffffe8,
+ 0x3ffffe9,
+ 0xffffffd,
+ 0x7ffffe3,
+ 0x7ffffe4,
+ 0x7ffffe5,
+ 0xfffec,
+ 0xfffff3,
+ 0xfffed,
+ 0x1fffe6,
+ 0x3fffe9,
+ 0x1fffe7,
+ 0x1fffe8,
+ 0x7ffff3,
+ 0x3fffea,
+ 0x3fffeb,
+ 0x1ffffee,
+ 0x1ffffef,
+ 0xfffff4,
+ 0xfffff5,
+ 0x3ffffea,
+ 0x7ffff4,
+ 0x3ffffeb,
+ 0x7ffffe6,
+ 0x3ffffec,
+ 0x3ffffed,
+ 0x7ffffe7,
+ 0x7ffffe8,
+ 0x7ffffe9,
+ 0x7ffffea,
+ 0x7ffffeb,
+ 0xffffffe,
+ 0x7ffffec,
+ 0x7ffffed,
+ 0x7ffffee,
+ 0x7ffffef,
+ 0x7fffff0,
+ 0x3ffffee,
+ 0x3fffffff,
+]
+
+REQUEST_CODES_LENGTH = [
+ 13, 23, 28, 28, 28, 28, 28, 28, 28, 24, 30, 28, 28, 30, 28, 28,
+ 28, 28, 28, 28, 28, 28, 30, 28, 28, 28, 28, 28, 28, 28, 28, 28,
+ 6, 10, 10, 12, 13, 6, 8, 11, 10, 10, 8, 11, 8, 6, 6, 6,
+ 5, 5, 5, 6, 6, 6, 6, 6, 6, 6, 7, 8, 15, 6, 12, 10,
+ 13, 6, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7,
+ 7, 7, 7, 7, 7, 7, 7, 7, 8, 7, 8, 13, 19, 13, 14, 6,
+ 15, 5, 6, 5, 6, 5, 6, 6, 6, 5, 7, 7, 6, 6, 6, 5,
+ 6, 7, 6, 5, 5, 6, 7, 7, 7, 7, 7, 15, 11, 14, 13, 28,
+ 20, 22, 20, 20, 22, 22, 22, 23, 22, 23, 23, 23, 23, 23, 24, 23,
+ 24, 24, 22, 23, 24, 23, 23, 23, 23, 21, 22, 23, 22, 23, 23, 24,
+ 22, 21, 20, 22, 22, 23, 23, 21, 23, 22, 22, 24, 21, 22, 23, 23,
+ 21, 21, 22, 21, 23, 22, 23, 23, 20, 22, 22, 22, 23, 22, 22, 23,
+ 26, 26, 20, 19, 22, 23, 22, 25, 26, 26, 26, 27, 27, 26, 24, 25,
+ 19, 21, 26, 27, 27, 26, 27, 24, 21, 21, 26, 26, 28, 27, 27, 27,
+ 20, 24, 20, 21, 22, 21, 21, 23, 22, 22, 25, 25, 24, 24, 26, 23,
+ 26, 27, 26, 26, 27, 27, 27, 27, 27, 28, 27, 27, 27, 27, 27, 26,
+ 30,
+]
diff --git a/.venv/lib/python3.9/site-packages/hpack/huffman_table.py b/.venv/lib/python3.9/site-packages/hpack/huffman_table.py
new file mode 100644
index 0000000..c199ef5
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack/huffman_table.py
@@ -0,0 +1,4739 @@
+# -*- coding: utf-8 -*-
+"""
+hpack/huffman_table
+~~~~~~~~~~~~~~~~~~~
+
+This implementation of a Huffman decoding table for HTTP/2 is essentially a
+Python port of the work originally done for nghttp2's Huffman decoding. For
+this reason, while this file is made available under the MIT license as is the
+rest of this module, this file is undoubtedly a derivative work of the nghttp2
+file ``nghttp2_hd_huffman_data.c``, obtained from
+https://github.com/tatsuhiro-t/nghttp2/ at commit
+d2b55ad1a245e1d1964579fa3fac36ebf3939e72. That work is made available under
+the Apache 2.0 license under the following terms:
+
+ Copyright (c) 2013 Tatsuhiro Tsujikawa
+
+ Permission is hereby granted, free of charge, to any person obtaining
+ a copy of this software and associated documentation files (the
+ "Software"), to deal in the Software without restriction, including
+ without limitation the rights to use, copy, modify, merge, publish,
+ distribute, sublicense, and/or sell copies of the Software, and to
+ permit persons to whom the Software is furnished to do so, subject to
+ the following conditions:
+
+ The above copyright notice and this permission notice shall be
+ included in all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+ EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+ NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+ LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+ OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+ WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+The essence of this approach is that it builds a finite state machine out of
+4-bit nibbles of Huffman coded data. The input function passes 4 bits worth of
+data to the state machine each time, which uses those 4 bits of data along with
+the current accumulated state data to process the data given.
+
+For the sake of efficiency, the in-memory representation of the states,
+transitions, and result values of the state machine are represented as a long
+list containing three-tuples. This list is enormously long, and viewing it as
+an in-memory representation is not very clear, but it is laid out here in a way
+that is intended to be *somewhat* more clear.
+
+Essentially, the list is structured as 256 collections of 16 entries (one for
+each nibble) of three-tuples. Each collection is called a "node", and the
+zeroth collection is called the "root node". The state machine tracks one
+value: the "state" byte.
+
+For each nibble passed to the state machine, it first multiplies the "state"
+byte by 16 and adds the numerical value of the nibble. This number is the index
+into the large flat list.
+
+The three-tuple that is found by looking up that index consists of three
+values:
+
+- a new state value, used for subsequent decoding
+- a collection of flags, used to determine whether data is emitted or whether
+ the state machine is complete.
+- the byte value to emit, assuming that emitting a byte is required.
+
+The flags are consulted, if necessary a byte is emitted, and then the next
+nibble is used. This continues until the state machine believes it has
+completely Huffman-decoded the data.
+
+This approach has relatively little indirection, and therefore performs
+relatively well, particularly on implementations like PyPy where the cost of
+loops at the Python-level is not too expensive. The total number of loop
+iterations is 4x the number of bytes passed to the decoder.
+"""
+from .exceptions import HPACKDecodingError
+
+
+# This defines the state machine "class" at the top of the file. The reason we
+# do this is to keep the terrifying monster state table at the *bottom* of the
+# file so you don't have to actually *look* at the damn thing.
+def decode_huffman(huffman_string):
+ """
+ Given a bytestring of Huffman-encoded data for HPACK, returns a bytestring
+ of the decompressed data.
+ """
+ if not huffman_string:
+ return b''
+
+ state = 0
+ flags = 0
+ decoded_bytes = bytearray()
+
+ # Perversely, bytearrays are a lot more convenient across Python 2 and
+ # Python 3 because they behave *the same way* on both platforms. Given that
+ # we really do want numerical bytes when we iterate here, let's use a
+ # bytearray.
+ huffman_string = bytearray(huffman_string)
+
+ # This loop is unrolled somewhat. Because we use a nibble, not a byte, we
+ # need to handle each nibble twice. We unroll that: it makes the loop body
+ # a bit longer, but that's ok.
+ for input_byte in huffman_string:
+ index = (state * 16) + (input_byte >> 4)
+ state, flags, output_byte = HUFFMAN_TABLE[index]
+
+ if flags & HUFFMAN_FAIL:
+ raise HPACKDecodingError("Invalid Huffman String")
+
+ if flags & HUFFMAN_EMIT_SYMBOL:
+ decoded_bytes.append(output_byte)
+
+ index = (state * 16) + (input_byte & 0x0F)
+ state, flags, output_byte = HUFFMAN_TABLE[index]
+
+ if flags & HUFFMAN_FAIL:
+ raise HPACKDecodingError("Invalid Huffman String")
+
+ if flags & HUFFMAN_EMIT_SYMBOL:
+ decoded_bytes.append(output_byte)
+
+ if not (flags & HUFFMAN_COMPLETE):
+ raise HPACKDecodingError("Incomplete Huffman string")
+
+ return bytes(decoded_bytes)
+
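The nibble-driven loop above can be exercised against a toy table to make the indexing rule (`index = state * 16 + nibble`) concrete. The single-node table here is a stand-in for illustration, not the real HPACK table:

```python
COMPLETE, EMIT = 1, 1 << 1

# Toy table: one root node in which every nibble is itself a complete
# code that emits its own value, so each input byte decodes to two symbols.
TOY_TABLE = [(0, COMPLETE | EMIT, n) for n in range(16)]

def toy_fsm_decode(data):
    state, out = 0, bytearray()
    for byte in data:
        # Same unrolled nibble order as decode_huffman: high half, then low.
        for nibble in (byte >> 4, byte & 0x0F):
            state, flags, symbol = TOY_TABLE[state * 16 + nibble]
            if flags & EMIT:
                out.append(symbol)
    return bytes(out)

print(toy_fsm_decode(b'\x12\xab'))  # → b'\x01\x02\n\x0b'
```

With the real table, most transitions land in non-root nodes instead of looping on state 0, but the per-nibble index arithmetic is exactly this.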
+
+# Some decoder flags to control state transitions.
+HUFFMAN_COMPLETE = 1
+HUFFMAN_EMIT_SYMBOL = (1 << 1)
+HUFFMAN_FAIL = (1 << 2)
+
+# This is the monster table. Avert your eyes, children.
+HUFFMAN_TABLE = [
+ # Node 0 (Root Node, never emits symbols.)
+ (4, 0, 0),
+ (5, 0, 0),
+ (7, 0, 0),
+ (8, 0, 0),
+ (11, 0, 0),
+ (12, 0, 0),
+ (16, 0, 0),
+ (19, 0, 0),
+ (25, 0, 0),
+ (28, 0, 0),
+ (32, 0, 0),
+ (35, 0, 0),
+ (42, 0, 0),
+ (49, 0, 0),
+ (57, 0, 0),
+ (64, HUFFMAN_COMPLETE, 0),
+
+ # Node 1
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 48),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 49),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 50),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 97),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 99),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 101),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 105),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 111),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 115),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 116),
+ (13, 0, 0),
+ (14, 0, 0),
+ (17, 0, 0),
+ (18, 0, 0),
+ (20, 0, 0),
+ (21, 0, 0),
+
+ # Node 2
+ (1, HUFFMAN_EMIT_SYMBOL, 48),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 48),
+ (1, HUFFMAN_EMIT_SYMBOL, 49),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 49),
+ (1, HUFFMAN_EMIT_SYMBOL, 50),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 50),
+ (1, HUFFMAN_EMIT_SYMBOL, 97),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 97),
+ (1, HUFFMAN_EMIT_SYMBOL, 99),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 99),
+ (1, HUFFMAN_EMIT_SYMBOL, 101),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 101),
+ (1, HUFFMAN_EMIT_SYMBOL, 105),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 105),
+ (1, HUFFMAN_EMIT_SYMBOL, 111),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 111),
+
+ # Node 3
+ (2, HUFFMAN_EMIT_SYMBOL, 48),
+ (9, HUFFMAN_EMIT_SYMBOL, 48),
+ (23, HUFFMAN_EMIT_SYMBOL, 48),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 48),
+ (2, HUFFMAN_EMIT_SYMBOL, 49),
+ (9, HUFFMAN_EMIT_SYMBOL, 49),
+ (23, HUFFMAN_EMIT_SYMBOL, 49),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 49),
+ (2, HUFFMAN_EMIT_SYMBOL, 50),
+ (9, HUFFMAN_EMIT_SYMBOL, 50),
+ (23, HUFFMAN_EMIT_SYMBOL, 50),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 50),
+ (2, HUFFMAN_EMIT_SYMBOL, 97),
+ (9, HUFFMAN_EMIT_SYMBOL, 97),
+ (23, HUFFMAN_EMIT_SYMBOL, 97),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 97),
+
+ # Node 4
+ (3, HUFFMAN_EMIT_SYMBOL, 48),
+ (6, HUFFMAN_EMIT_SYMBOL, 48),
+ (10, HUFFMAN_EMIT_SYMBOL, 48),
+ (15, HUFFMAN_EMIT_SYMBOL, 48),
+ (24, HUFFMAN_EMIT_SYMBOL, 48),
+ (31, HUFFMAN_EMIT_SYMBOL, 48),
+ (41, HUFFMAN_EMIT_SYMBOL, 48),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 48),
+ (3, HUFFMAN_EMIT_SYMBOL, 49),
+ (6, HUFFMAN_EMIT_SYMBOL, 49),
+ (10, HUFFMAN_EMIT_SYMBOL, 49),
+ (15, HUFFMAN_EMIT_SYMBOL, 49),
+ (24, HUFFMAN_EMIT_SYMBOL, 49),
+ (31, HUFFMAN_EMIT_SYMBOL, 49),
+ (41, HUFFMAN_EMIT_SYMBOL, 49),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 49),
+
+ # Node 5
+ (3, HUFFMAN_EMIT_SYMBOL, 50),
+ (6, HUFFMAN_EMIT_SYMBOL, 50),
+ (10, HUFFMAN_EMIT_SYMBOL, 50),
+ (15, HUFFMAN_EMIT_SYMBOL, 50),
+ (24, HUFFMAN_EMIT_SYMBOL, 50),
+ (31, HUFFMAN_EMIT_SYMBOL, 50),
+ (41, HUFFMAN_EMIT_SYMBOL, 50),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 50),
+ (3, HUFFMAN_EMIT_SYMBOL, 97),
+ (6, HUFFMAN_EMIT_SYMBOL, 97),
+ (10, HUFFMAN_EMIT_SYMBOL, 97),
+ (15, HUFFMAN_EMIT_SYMBOL, 97),
+ (24, HUFFMAN_EMIT_SYMBOL, 97),
+ (31, HUFFMAN_EMIT_SYMBOL, 97),
+ (41, HUFFMAN_EMIT_SYMBOL, 97),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 97),
+
+ # Node 6
+ (2, HUFFMAN_EMIT_SYMBOL, 99),
+ (9, HUFFMAN_EMIT_SYMBOL, 99),
+ (23, HUFFMAN_EMIT_SYMBOL, 99),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 99),
+ (2, HUFFMAN_EMIT_SYMBOL, 101),
+ (9, HUFFMAN_EMIT_SYMBOL, 101),
+ (23, HUFFMAN_EMIT_SYMBOL, 101),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 101),
+ (2, HUFFMAN_EMIT_SYMBOL, 105),
+ (9, HUFFMAN_EMIT_SYMBOL, 105),
+ (23, HUFFMAN_EMIT_SYMBOL, 105),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 105),
+ (2, HUFFMAN_EMIT_SYMBOL, 111),
+ (9, HUFFMAN_EMIT_SYMBOL, 111),
+ (23, HUFFMAN_EMIT_SYMBOL, 111),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 111),
+
+ # Node 7
+ (3, HUFFMAN_EMIT_SYMBOL, 99),
+ (6, HUFFMAN_EMIT_SYMBOL, 99),
+ (10, HUFFMAN_EMIT_SYMBOL, 99),
+ (15, HUFFMAN_EMIT_SYMBOL, 99),
+ (24, HUFFMAN_EMIT_SYMBOL, 99),
+ (31, HUFFMAN_EMIT_SYMBOL, 99),
+ (41, HUFFMAN_EMIT_SYMBOL, 99),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 99),
+ (3, HUFFMAN_EMIT_SYMBOL, 101),
+ (6, HUFFMAN_EMIT_SYMBOL, 101),
+ (10, HUFFMAN_EMIT_SYMBOL, 101),
+ (15, HUFFMAN_EMIT_SYMBOL, 101),
+ (24, HUFFMAN_EMIT_SYMBOL, 101),
+ (31, HUFFMAN_EMIT_SYMBOL, 101),
+ (41, HUFFMAN_EMIT_SYMBOL, 101),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 101),
+
+ # Node 8
+ (3, HUFFMAN_EMIT_SYMBOL, 105),
+ (6, HUFFMAN_EMIT_SYMBOL, 105),
+ (10, HUFFMAN_EMIT_SYMBOL, 105),
+ (15, HUFFMAN_EMIT_SYMBOL, 105),
+ (24, HUFFMAN_EMIT_SYMBOL, 105),
+ (31, HUFFMAN_EMIT_SYMBOL, 105),
+ (41, HUFFMAN_EMIT_SYMBOL, 105),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 105),
+ (3, HUFFMAN_EMIT_SYMBOL, 111),
+ (6, HUFFMAN_EMIT_SYMBOL, 111),
+ (10, HUFFMAN_EMIT_SYMBOL, 111),
+ (15, HUFFMAN_EMIT_SYMBOL, 111),
+ (24, HUFFMAN_EMIT_SYMBOL, 111),
+ (31, HUFFMAN_EMIT_SYMBOL, 111),
+ (41, HUFFMAN_EMIT_SYMBOL, 111),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 111),
+
+ # Node 9
+ (1, HUFFMAN_EMIT_SYMBOL, 115),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 115),
+ (1, HUFFMAN_EMIT_SYMBOL, 116),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 116),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 32),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 37),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 45),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 46),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 47),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 51),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 52),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 53),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 54),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 55),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 56),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 57),
+
+ # Node 10
+ (2, HUFFMAN_EMIT_SYMBOL, 115),
+ (9, HUFFMAN_EMIT_SYMBOL, 115),
+ (23, HUFFMAN_EMIT_SYMBOL, 115),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 115),
+ (2, HUFFMAN_EMIT_SYMBOL, 116),
+ (9, HUFFMAN_EMIT_SYMBOL, 116),
+ (23, HUFFMAN_EMIT_SYMBOL, 116),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 116),
+ (1, HUFFMAN_EMIT_SYMBOL, 32),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 32),
+ (1, HUFFMAN_EMIT_SYMBOL, 37),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 37),
+ (1, HUFFMAN_EMIT_SYMBOL, 45),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 45),
+ (1, HUFFMAN_EMIT_SYMBOL, 46),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 46),
+
+ # Node 11
+ (3, HUFFMAN_EMIT_SYMBOL, 115),
+ (6, HUFFMAN_EMIT_SYMBOL, 115),
+ (10, HUFFMAN_EMIT_SYMBOL, 115),
+ (15, HUFFMAN_EMIT_SYMBOL, 115),
+ (24, HUFFMAN_EMIT_SYMBOL, 115),
+ (31, HUFFMAN_EMIT_SYMBOL, 115),
+ (41, HUFFMAN_EMIT_SYMBOL, 115),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 115),
+ (3, HUFFMAN_EMIT_SYMBOL, 116),
+ (6, HUFFMAN_EMIT_SYMBOL, 116),
+ (10, HUFFMAN_EMIT_SYMBOL, 116),
+ (15, HUFFMAN_EMIT_SYMBOL, 116),
+ (24, HUFFMAN_EMIT_SYMBOL, 116),
+ (31, HUFFMAN_EMIT_SYMBOL, 116),
+ (41, HUFFMAN_EMIT_SYMBOL, 116),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 116),
+
+ # Node 12
+ (2, HUFFMAN_EMIT_SYMBOL, 32),
+ (9, HUFFMAN_EMIT_SYMBOL, 32),
+ (23, HUFFMAN_EMIT_SYMBOL, 32),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 32),
+ (2, HUFFMAN_EMIT_SYMBOL, 37),
+ (9, HUFFMAN_EMIT_SYMBOL, 37),
+ (23, HUFFMAN_EMIT_SYMBOL, 37),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 37),
+ (2, HUFFMAN_EMIT_SYMBOL, 45),
+ (9, HUFFMAN_EMIT_SYMBOL, 45),
+ (23, HUFFMAN_EMIT_SYMBOL, 45),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 45),
+ (2, HUFFMAN_EMIT_SYMBOL, 46),
+ (9, HUFFMAN_EMIT_SYMBOL, 46),
+ (23, HUFFMAN_EMIT_SYMBOL, 46),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 46),
+
+ # Node 13
+ (3, HUFFMAN_EMIT_SYMBOL, 32),
+ (6, HUFFMAN_EMIT_SYMBOL, 32),
+ (10, HUFFMAN_EMIT_SYMBOL, 32),
+ (15, HUFFMAN_EMIT_SYMBOL, 32),
+ (24, HUFFMAN_EMIT_SYMBOL, 32),
+ (31, HUFFMAN_EMIT_SYMBOL, 32),
+ (41, HUFFMAN_EMIT_SYMBOL, 32),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 32),
+ (3, HUFFMAN_EMIT_SYMBOL, 37),
+ (6, HUFFMAN_EMIT_SYMBOL, 37),
+ (10, HUFFMAN_EMIT_SYMBOL, 37),
+ (15, HUFFMAN_EMIT_SYMBOL, 37),
+ (24, HUFFMAN_EMIT_SYMBOL, 37),
+ (31, HUFFMAN_EMIT_SYMBOL, 37),
+ (41, HUFFMAN_EMIT_SYMBOL, 37),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 37),
+
+ # Node 14
+ (3, HUFFMAN_EMIT_SYMBOL, 45),
+ (6, HUFFMAN_EMIT_SYMBOL, 45),
+ (10, HUFFMAN_EMIT_SYMBOL, 45),
+ (15, HUFFMAN_EMIT_SYMBOL, 45),
+ (24, HUFFMAN_EMIT_SYMBOL, 45),
+ (31, HUFFMAN_EMIT_SYMBOL, 45),
+ (41, HUFFMAN_EMIT_SYMBOL, 45),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 45),
+ (3, HUFFMAN_EMIT_SYMBOL, 46),
+ (6, HUFFMAN_EMIT_SYMBOL, 46),
+ (10, HUFFMAN_EMIT_SYMBOL, 46),
+ (15, HUFFMAN_EMIT_SYMBOL, 46),
+ (24, HUFFMAN_EMIT_SYMBOL, 46),
+ (31, HUFFMAN_EMIT_SYMBOL, 46),
+ (41, HUFFMAN_EMIT_SYMBOL, 46),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 46),
+
+ # Node 15
+ (1, HUFFMAN_EMIT_SYMBOL, 47),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 47),
+ (1, HUFFMAN_EMIT_SYMBOL, 51),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 51),
+ (1, HUFFMAN_EMIT_SYMBOL, 52),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 52),
+ (1, HUFFMAN_EMIT_SYMBOL, 53),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 53),
+ (1, HUFFMAN_EMIT_SYMBOL, 54),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 54),
+ (1, HUFFMAN_EMIT_SYMBOL, 55),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 55),
+ (1, HUFFMAN_EMIT_SYMBOL, 56),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 56),
+ (1, HUFFMAN_EMIT_SYMBOL, 57),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 57),
+
+ # Node 16
+ (2, HUFFMAN_EMIT_SYMBOL, 47),
+ (9, HUFFMAN_EMIT_SYMBOL, 47),
+ (23, HUFFMAN_EMIT_SYMBOL, 47),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 47),
+ (2, HUFFMAN_EMIT_SYMBOL, 51),
+ (9, HUFFMAN_EMIT_SYMBOL, 51),
+ (23, HUFFMAN_EMIT_SYMBOL, 51),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 51),
+ (2, HUFFMAN_EMIT_SYMBOL, 52),
+ (9, HUFFMAN_EMIT_SYMBOL, 52),
+ (23, HUFFMAN_EMIT_SYMBOL, 52),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 52),
+ (2, HUFFMAN_EMIT_SYMBOL, 53),
+ (9, HUFFMAN_EMIT_SYMBOL, 53),
+ (23, HUFFMAN_EMIT_SYMBOL, 53),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 53),
+
+ # Node 17
+ (3, HUFFMAN_EMIT_SYMBOL, 47),
+ (6, HUFFMAN_EMIT_SYMBOL, 47),
+ (10, HUFFMAN_EMIT_SYMBOL, 47),
+ (15, HUFFMAN_EMIT_SYMBOL, 47),
+ (24, HUFFMAN_EMIT_SYMBOL, 47),
+ (31, HUFFMAN_EMIT_SYMBOL, 47),
+ (41, HUFFMAN_EMIT_SYMBOL, 47),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 47),
+ (3, HUFFMAN_EMIT_SYMBOL, 51),
+ (6, HUFFMAN_EMIT_SYMBOL, 51),
+ (10, HUFFMAN_EMIT_SYMBOL, 51),
+ (15, HUFFMAN_EMIT_SYMBOL, 51),
+ (24, HUFFMAN_EMIT_SYMBOL, 51),
+ (31, HUFFMAN_EMIT_SYMBOL, 51),
+ (41, HUFFMAN_EMIT_SYMBOL, 51),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 51),
+
+ # Node 18
+ (3, HUFFMAN_EMIT_SYMBOL, 52),
+ (6, HUFFMAN_EMIT_SYMBOL, 52),
+ (10, HUFFMAN_EMIT_SYMBOL, 52),
+ (15, HUFFMAN_EMIT_SYMBOL, 52),
+ (24, HUFFMAN_EMIT_SYMBOL, 52),
+ (31, HUFFMAN_EMIT_SYMBOL, 52),
+ (41, HUFFMAN_EMIT_SYMBOL, 52),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 52),
+ (3, HUFFMAN_EMIT_SYMBOL, 53),
+ (6, HUFFMAN_EMIT_SYMBOL, 53),
+ (10, HUFFMAN_EMIT_SYMBOL, 53),
+ (15, HUFFMAN_EMIT_SYMBOL, 53),
+ (24, HUFFMAN_EMIT_SYMBOL, 53),
+ (31, HUFFMAN_EMIT_SYMBOL, 53),
+ (41, HUFFMAN_EMIT_SYMBOL, 53),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 53),
+
+ # Node 19
+ (2, HUFFMAN_EMIT_SYMBOL, 54),
+ (9, HUFFMAN_EMIT_SYMBOL, 54),
+ (23, HUFFMAN_EMIT_SYMBOL, 54),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 54),
+ (2, HUFFMAN_EMIT_SYMBOL, 55),
+ (9, HUFFMAN_EMIT_SYMBOL, 55),
+ (23, HUFFMAN_EMIT_SYMBOL, 55),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 55),
+ (2, HUFFMAN_EMIT_SYMBOL, 56),
+ (9, HUFFMAN_EMIT_SYMBOL, 56),
+ (23, HUFFMAN_EMIT_SYMBOL, 56),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 56),
+ (2, HUFFMAN_EMIT_SYMBOL, 57),
+ (9, HUFFMAN_EMIT_SYMBOL, 57),
+ (23, HUFFMAN_EMIT_SYMBOL, 57),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 57),
+
+ # Node 20
+ (3, HUFFMAN_EMIT_SYMBOL, 54),
+ (6, HUFFMAN_EMIT_SYMBOL, 54),
+ (10, HUFFMAN_EMIT_SYMBOL, 54),
+ (15, HUFFMAN_EMIT_SYMBOL, 54),
+ (24, HUFFMAN_EMIT_SYMBOL, 54),
+ (31, HUFFMAN_EMIT_SYMBOL, 54),
+ (41, HUFFMAN_EMIT_SYMBOL, 54),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 54),
+ (3, HUFFMAN_EMIT_SYMBOL, 55),
+ (6, HUFFMAN_EMIT_SYMBOL, 55),
+ (10, HUFFMAN_EMIT_SYMBOL, 55),
+ (15, HUFFMAN_EMIT_SYMBOL, 55),
+ (24, HUFFMAN_EMIT_SYMBOL, 55),
+ (31, HUFFMAN_EMIT_SYMBOL, 55),
+ (41, HUFFMAN_EMIT_SYMBOL, 55),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 55),
+
+ # Node 21
+ (3, HUFFMAN_EMIT_SYMBOL, 56),
+ (6, HUFFMAN_EMIT_SYMBOL, 56),
+ (10, HUFFMAN_EMIT_SYMBOL, 56),
+ (15, HUFFMAN_EMIT_SYMBOL, 56),
+ (24, HUFFMAN_EMIT_SYMBOL, 56),
+ (31, HUFFMAN_EMIT_SYMBOL, 56),
+ (41, HUFFMAN_EMIT_SYMBOL, 56),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 56),
+ (3, HUFFMAN_EMIT_SYMBOL, 57),
+ (6, HUFFMAN_EMIT_SYMBOL, 57),
+ (10, HUFFMAN_EMIT_SYMBOL, 57),
+ (15, HUFFMAN_EMIT_SYMBOL, 57),
+ (24, HUFFMAN_EMIT_SYMBOL, 57),
+ (31, HUFFMAN_EMIT_SYMBOL, 57),
+ (41, HUFFMAN_EMIT_SYMBOL, 57),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 57),
+
+ # Node 22
+ (26, 0, 0),
+ (27, 0, 0),
+ (29, 0, 0),
+ (30, 0, 0),
+ (33, 0, 0),
+ (34, 0, 0),
+ (36, 0, 0),
+ (37, 0, 0),
+ (43, 0, 0),
+ (46, 0, 0),
+ (50, 0, 0),
+ (53, 0, 0),
+ (58, 0, 0),
+ (61, 0, 0),
+ (65, 0, 0),
+ (68, HUFFMAN_COMPLETE, 0),
+
+ # Node 23
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 61),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 65),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 95),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 98),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 100),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 102),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 103),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 104),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 108),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 109),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 110),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 112),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 114),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 117),
+ (38, 0, 0),
+ (39, 0, 0),
+
+ # Node 24
+ (1, HUFFMAN_EMIT_SYMBOL, 61),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 61),
+ (1, HUFFMAN_EMIT_SYMBOL, 65),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 65),
+ (1, HUFFMAN_EMIT_SYMBOL, 95),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 95),
+ (1, HUFFMAN_EMIT_SYMBOL, 98),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 98),
+ (1, HUFFMAN_EMIT_SYMBOL, 100),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 100),
+ (1, HUFFMAN_EMIT_SYMBOL, 102),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 102),
+ (1, HUFFMAN_EMIT_SYMBOL, 103),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 103),
+ (1, HUFFMAN_EMIT_SYMBOL, 104),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 104),
+
+ # Node 25
+ (2, HUFFMAN_EMIT_SYMBOL, 61),
+ (9, HUFFMAN_EMIT_SYMBOL, 61),
+ (23, HUFFMAN_EMIT_SYMBOL, 61),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 61),
+ (2, HUFFMAN_EMIT_SYMBOL, 65),
+ (9, HUFFMAN_EMIT_SYMBOL, 65),
+ (23, HUFFMAN_EMIT_SYMBOL, 65),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 65),
+ (2, HUFFMAN_EMIT_SYMBOL, 95),
+ (9, HUFFMAN_EMIT_SYMBOL, 95),
+ (23, HUFFMAN_EMIT_SYMBOL, 95),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 95),
+ (2, HUFFMAN_EMIT_SYMBOL, 98),
+ (9, HUFFMAN_EMIT_SYMBOL, 98),
+ (23, HUFFMAN_EMIT_SYMBOL, 98),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 98),
+
+ # Node 26
+ (3, HUFFMAN_EMIT_SYMBOL, 61),
+ (6, HUFFMAN_EMIT_SYMBOL, 61),
+ (10, HUFFMAN_EMIT_SYMBOL, 61),
+ (15, HUFFMAN_EMIT_SYMBOL, 61),
+ (24, HUFFMAN_EMIT_SYMBOL, 61),
+ (31, HUFFMAN_EMIT_SYMBOL, 61),
+ (41, HUFFMAN_EMIT_SYMBOL, 61),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 61),
+ (3, HUFFMAN_EMIT_SYMBOL, 65),
+ (6, HUFFMAN_EMIT_SYMBOL, 65),
+ (10, HUFFMAN_EMIT_SYMBOL, 65),
+ (15, HUFFMAN_EMIT_SYMBOL, 65),
+ (24, HUFFMAN_EMIT_SYMBOL, 65),
+ (31, HUFFMAN_EMIT_SYMBOL, 65),
+ (41, HUFFMAN_EMIT_SYMBOL, 65),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 65),
+
+ # Node 27
+ (3, HUFFMAN_EMIT_SYMBOL, 95),
+ (6, HUFFMAN_EMIT_SYMBOL, 95),
+ (10, HUFFMAN_EMIT_SYMBOL, 95),
+ (15, HUFFMAN_EMIT_SYMBOL, 95),
+ (24, HUFFMAN_EMIT_SYMBOL, 95),
+ (31, HUFFMAN_EMIT_SYMBOL, 95),
+ (41, HUFFMAN_EMIT_SYMBOL, 95),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 95),
+ (3, HUFFMAN_EMIT_SYMBOL, 98),
+ (6, HUFFMAN_EMIT_SYMBOL, 98),
+ (10, HUFFMAN_EMIT_SYMBOL, 98),
+ (15, HUFFMAN_EMIT_SYMBOL, 98),
+ (24, HUFFMAN_EMIT_SYMBOL, 98),
+ (31, HUFFMAN_EMIT_SYMBOL, 98),
+ (41, HUFFMAN_EMIT_SYMBOL, 98),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 98),
+
+ # Node 28
+ (2, HUFFMAN_EMIT_SYMBOL, 100),
+ (9, HUFFMAN_EMIT_SYMBOL, 100),
+ (23, HUFFMAN_EMIT_SYMBOL, 100),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 100),
+ (2, HUFFMAN_EMIT_SYMBOL, 102),
+ (9, HUFFMAN_EMIT_SYMBOL, 102),
+ (23, HUFFMAN_EMIT_SYMBOL, 102),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 102),
+ (2, HUFFMAN_EMIT_SYMBOL, 103),
+ (9, HUFFMAN_EMIT_SYMBOL, 103),
+ (23, HUFFMAN_EMIT_SYMBOL, 103),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 103),
+ (2, HUFFMAN_EMIT_SYMBOL, 104),
+ (9, HUFFMAN_EMIT_SYMBOL, 104),
+ (23, HUFFMAN_EMIT_SYMBOL, 104),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 104),
+
+ # Node 29
+ (3, HUFFMAN_EMIT_SYMBOL, 100),
+ (6, HUFFMAN_EMIT_SYMBOL, 100),
+ (10, HUFFMAN_EMIT_SYMBOL, 100),
+ (15, HUFFMAN_EMIT_SYMBOL, 100),
+ (24, HUFFMAN_EMIT_SYMBOL, 100),
+ (31, HUFFMAN_EMIT_SYMBOL, 100),
+ (41, HUFFMAN_EMIT_SYMBOL, 100),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 100),
+ (3, HUFFMAN_EMIT_SYMBOL, 102),
+ (6, HUFFMAN_EMIT_SYMBOL, 102),
+ (10, HUFFMAN_EMIT_SYMBOL, 102),
+ (15, HUFFMAN_EMIT_SYMBOL, 102),
+ (24, HUFFMAN_EMIT_SYMBOL, 102),
+ (31, HUFFMAN_EMIT_SYMBOL, 102),
+ (41, HUFFMAN_EMIT_SYMBOL, 102),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 102),
+
+ # Node 30
+ (3, HUFFMAN_EMIT_SYMBOL, 103),
+ (6, HUFFMAN_EMIT_SYMBOL, 103),
+ (10, HUFFMAN_EMIT_SYMBOL, 103),
+ (15, HUFFMAN_EMIT_SYMBOL, 103),
+ (24, HUFFMAN_EMIT_SYMBOL, 103),
+ (31, HUFFMAN_EMIT_SYMBOL, 103),
+ (41, HUFFMAN_EMIT_SYMBOL, 103),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 103),
+ (3, HUFFMAN_EMIT_SYMBOL, 104),
+ (6, HUFFMAN_EMIT_SYMBOL, 104),
+ (10, HUFFMAN_EMIT_SYMBOL, 104),
+ (15, HUFFMAN_EMIT_SYMBOL, 104),
+ (24, HUFFMAN_EMIT_SYMBOL, 104),
+ (31, HUFFMAN_EMIT_SYMBOL, 104),
+ (41, HUFFMAN_EMIT_SYMBOL, 104),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 104),
+
+ # Node 31
+ (1, HUFFMAN_EMIT_SYMBOL, 108),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 108),
+ (1, HUFFMAN_EMIT_SYMBOL, 109),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 109),
+ (1, HUFFMAN_EMIT_SYMBOL, 110),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 110),
+ (1, HUFFMAN_EMIT_SYMBOL, 112),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 112),
+ (1, HUFFMAN_EMIT_SYMBOL, 114),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 114),
+ (1, HUFFMAN_EMIT_SYMBOL, 117),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 117),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 58),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 66),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 67),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 68),
+
+ # Node 32
+ (2, HUFFMAN_EMIT_SYMBOL, 108),
+ (9, HUFFMAN_EMIT_SYMBOL, 108),
+ (23, HUFFMAN_EMIT_SYMBOL, 108),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 108),
+ (2, HUFFMAN_EMIT_SYMBOL, 109),
+ (9, HUFFMAN_EMIT_SYMBOL, 109),
+ (23, HUFFMAN_EMIT_SYMBOL, 109),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 109),
+ (2, HUFFMAN_EMIT_SYMBOL, 110),
+ (9, HUFFMAN_EMIT_SYMBOL, 110),
+ (23, HUFFMAN_EMIT_SYMBOL, 110),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 110),
+ (2, HUFFMAN_EMIT_SYMBOL, 112),
+ (9, HUFFMAN_EMIT_SYMBOL, 112),
+ (23, HUFFMAN_EMIT_SYMBOL, 112),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 112),
+
+ # Node 33
+ (3, HUFFMAN_EMIT_SYMBOL, 108),
+ (6, HUFFMAN_EMIT_SYMBOL, 108),
+ (10, HUFFMAN_EMIT_SYMBOL, 108),
+ (15, HUFFMAN_EMIT_SYMBOL, 108),
+ (24, HUFFMAN_EMIT_SYMBOL, 108),
+ (31, HUFFMAN_EMIT_SYMBOL, 108),
+ (41, HUFFMAN_EMIT_SYMBOL, 108),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 108),
+ (3, HUFFMAN_EMIT_SYMBOL, 109),
+ (6, HUFFMAN_EMIT_SYMBOL, 109),
+ (10, HUFFMAN_EMIT_SYMBOL, 109),
+ (15, HUFFMAN_EMIT_SYMBOL, 109),
+ (24, HUFFMAN_EMIT_SYMBOL, 109),
+ (31, HUFFMAN_EMIT_SYMBOL, 109),
+ (41, HUFFMAN_EMIT_SYMBOL, 109),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 109),
+
+ # Node 34
+ (3, HUFFMAN_EMIT_SYMBOL, 110),
+ (6, HUFFMAN_EMIT_SYMBOL, 110),
+ (10, HUFFMAN_EMIT_SYMBOL, 110),
+ (15, HUFFMAN_EMIT_SYMBOL, 110),
+ (24, HUFFMAN_EMIT_SYMBOL, 110),
+ (31, HUFFMAN_EMIT_SYMBOL, 110),
+ (41, HUFFMAN_EMIT_SYMBOL, 110),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 110),
+ (3, HUFFMAN_EMIT_SYMBOL, 112),
+ (6, HUFFMAN_EMIT_SYMBOL, 112),
+ (10, HUFFMAN_EMIT_SYMBOL, 112),
+ (15, HUFFMAN_EMIT_SYMBOL, 112),
+ (24, HUFFMAN_EMIT_SYMBOL, 112),
+ (31, HUFFMAN_EMIT_SYMBOL, 112),
+ (41, HUFFMAN_EMIT_SYMBOL, 112),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 112),
+
+ # Node 35
+ (2, HUFFMAN_EMIT_SYMBOL, 114),
+ (9, HUFFMAN_EMIT_SYMBOL, 114),
+ (23, HUFFMAN_EMIT_SYMBOL, 114),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 114),
+ (2, HUFFMAN_EMIT_SYMBOL, 117),
+ (9, HUFFMAN_EMIT_SYMBOL, 117),
+ (23, HUFFMAN_EMIT_SYMBOL, 117),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 117),
+ (1, HUFFMAN_EMIT_SYMBOL, 58),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 58),
+ (1, HUFFMAN_EMIT_SYMBOL, 66),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 66),
+ (1, HUFFMAN_EMIT_SYMBOL, 67),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 67),
+ (1, HUFFMAN_EMIT_SYMBOL, 68),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 68),
+
+ # Node 36
+ (3, HUFFMAN_EMIT_SYMBOL, 114),
+ (6, HUFFMAN_EMIT_SYMBOL, 114),
+ (10, HUFFMAN_EMIT_SYMBOL, 114),
+ (15, HUFFMAN_EMIT_SYMBOL, 114),
+ (24, HUFFMAN_EMIT_SYMBOL, 114),
+ (31, HUFFMAN_EMIT_SYMBOL, 114),
+ (41, HUFFMAN_EMIT_SYMBOL, 114),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 114),
+ (3, HUFFMAN_EMIT_SYMBOL, 117),
+ (6, HUFFMAN_EMIT_SYMBOL, 117),
+ (10, HUFFMAN_EMIT_SYMBOL, 117),
+ (15, HUFFMAN_EMIT_SYMBOL, 117),
+ (24, HUFFMAN_EMIT_SYMBOL, 117),
+ (31, HUFFMAN_EMIT_SYMBOL, 117),
+ (41, HUFFMAN_EMIT_SYMBOL, 117),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 117),
+
+ # Node 37
+ (2, HUFFMAN_EMIT_SYMBOL, 58),
+ (9, HUFFMAN_EMIT_SYMBOL, 58),
+ (23, HUFFMAN_EMIT_SYMBOL, 58),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 58),
+ (2, HUFFMAN_EMIT_SYMBOL, 66),
+ (9, HUFFMAN_EMIT_SYMBOL, 66),
+ (23, HUFFMAN_EMIT_SYMBOL, 66),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 66),
+ (2, HUFFMAN_EMIT_SYMBOL, 67),
+ (9, HUFFMAN_EMIT_SYMBOL, 67),
+ (23, HUFFMAN_EMIT_SYMBOL, 67),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 67),
+ (2, HUFFMAN_EMIT_SYMBOL, 68),
+ (9, HUFFMAN_EMIT_SYMBOL, 68),
+ (23, HUFFMAN_EMIT_SYMBOL, 68),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 68),
+
+ # Node 38
+ (3, HUFFMAN_EMIT_SYMBOL, 58),
+ (6, HUFFMAN_EMIT_SYMBOL, 58),
+ (10, HUFFMAN_EMIT_SYMBOL, 58),
+ (15, HUFFMAN_EMIT_SYMBOL, 58),
+ (24, HUFFMAN_EMIT_SYMBOL, 58),
+ (31, HUFFMAN_EMIT_SYMBOL, 58),
+ (41, HUFFMAN_EMIT_SYMBOL, 58),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 58),
+ (3, HUFFMAN_EMIT_SYMBOL, 66),
+ (6, HUFFMAN_EMIT_SYMBOL, 66),
+ (10, HUFFMAN_EMIT_SYMBOL, 66),
+ (15, HUFFMAN_EMIT_SYMBOL, 66),
+ (24, HUFFMAN_EMIT_SYMBOL, 66),
+ (31, HUFFMAN_EMIT_SYMBOL, 66),
+ (41, HUFFMAN_EMIT_SYMBOL, 66),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 66),
+
+ # Node 39
+ (3, HUFFMAN_EMIT_SYMBOL, 67),
+ (6, HUFFMAN_EMIT_SYMBOL, 67),
+ (10, HUFFMAN_EMIT_SYMBOL, 67),
+ (15, HUFFMAN_EMIT_SYMBOL, 67),
+ (24, HUFFMAN_EMIT_SYMBOL, 67),
+ (31, HUFFMAN_EMIT_SYMBOL, 67),
+ (41, HUFFMAN_EMIT_SYMBOL, 67),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 67),
+ (3, HUFFMAN_EMIT_SYMBOL, 68),
+ (6, HUFFMAN_EMIT_SYMBOL, 68),
+ (10, HUFFMAN_EMIT_SYMBOL, 68),
+ (15, HUFFMAN_EMIT_SYMBOL, 68),
+ (24, HUFFMAN_EMIT_SYMBOL, 68),
+ (31, HUFFMAN_EMIT_SYMBOL, 68),
+ (41, HUFFMAN_EMIT_SYMBOL, 68),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 68),
+
+ # Node 40
+ (44, 0, 0),
+ (45, 0, 0),
+ (47, 0, 0),
+ (48, 0, 0),
+ (51, 0, 0),
+ (52, 0, 0),
+ (54, 0, 0),
+ (55, 0, 0),
+ (59, 0, 0),
+ (60, 0, 0),
+ (62, 0, 0),
+ (63, 0, 0),
+ (66, 0, 0),
+ (67, 0, 0),
+ (69, 0, 0),
+ (72, HUFFMAN_COMPLETE, 0),
+
+ # Node 41
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 69),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 70),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 71),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 72),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 73),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 74),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 75),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 76),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 77),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 78),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 79),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 80),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 81),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 82),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 83),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 84),
+
+ # Node 42
+ (1, HUFFMAN_EMIT_SYMBOL, 69),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 69),
+ (1, HUFFMAN_EMIT_SYMBOL, 70),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 70),
+ (1, HUFFMAN_EMIT_SYMBOL, 71),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 71),
+ (1, HUFFMAN_EMIT_SYMBOL, 72),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 72),
+ (1, HUFFMAN_EMIT_SYMBOL, 73),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 73),
+ (1, HUFFMAN_EMIT_SYMBOL, 74),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 74),
+ (1, HUFFMAN_EMIT_SYMBOL, 75),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 75),
+ (1, HUFFMAN_EMIT_SYMBOL, 76),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 76),
+
+ # Node 43
+ (2, HUFFMAN_EMIT_SYMBOL, 69),
+ (9, HUFFMAN_EMIT_SYMBOL, 69),
+ (23, HUFFMAN_EMIT_SYMBOL, 69),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 69),
+ (2, HUFFMAN_EMIT_SYMBOL, 70),
+ (9, HUFFMAN_EMIT_SYMBOL, 70),
+ (23, HUFFMAN_EMIT_SYMBOL, 70),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 70),
+ (2, HUFFMAN_EMIT_SYMBOL, 71),
+ (9, HUFFMAN_EMIT_SYMBOL, 71),
+ (23, HUFFMAN_EMIT_SYMBOL, 71),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 71),
+ (2, HUFFMAN_EMIT_SYMBOL, 72),
+ (9, HUFFMAN_EMIT_SYMBOL, 72),
+ (23, HUFFMAN_EMIT_SYMBOL, 72),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 72),
+
+ # Node 44
+ (3, HUFFMAN_EMIT_SYMBOL, 69),
+ (6, HUFFMAN_EMIT_SYMBOL, 69),
+ (10, HUFFMAN_EMIT_SYMBOL, 69),
+ (15, HUFFMAN_EMIT_SYMBOL, 69),
+ (24, HUFFMAN_EMIT_SYMBOL, 69),
+ (31, HUFFMAN_EMIT_SYMBOL, 69),
+ (41, HUFFMAN_EMIT_SYMBOL, 69),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 69),
+ (3, HUFFMAN_EMIT_SYMBOL, 70),
+ (6, HUFFMAN_EMIT_SYMBOL, 70),
+ (10, HUFFMAN_EMIT_SYMBOL, 70),
+ (15, HUFFMAN_EMIT_SYMBOL, 70),
+ (24, HUFFMAN_EMIT_SYMBOL, 70),
+ (31, HUFFMAN_EMIT_SYMBOL, 70),
+ (41, HUFFMAN_EMIT_SYMBOL, 70),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 70),
+
+ # Node 45
+ (3, HUFFMAN_EMIT_SYMBOL, 71),
+ (6, HUFFMAN_EMIT_SYMBOL, 71),
+ (10, HUFFMAN_EMIT_SYMBOL, 71),
+ (15, HUFFMAN_EMIT_SYMBOL, 71),
+ (24, HUFFMAN_EMIT_SYMBOL, 71),
+ (31, HUFFMAN_EMIT_SYMBOL, 71),
+ (41, HUFFMAN_EMIT_SYMBOL, 71),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 71),
+ (3, HUFFMAN_EMIT_SYMBOL, 72),
+ (6, HUFFMAN_EMIT_SYMBOL, 72),
+ (10, HUFFMAN_EMIT_SYMBOL, 72),
+ (15, HUFFMAN_EMIT_SYMBOL, 72),
+ (24, HUFFMAN_EMIT_SYMBOL, 72),
+ (31, HUFFMAN_EMIT_SYMBOL, 72),
+ (41, HUFFMAN_EMIT_SYMBOL, 72),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 72),
+
+ # Node 46
+ (2, HUFFMAN_EMIT_SYMBOL, 73),
+ (9, HUFFMAN_EMIT_SYMBOL, 73),
+ (23, HUFFMAN_EMIT_SYMBOL, 73),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 73),
+ (2, HUFFMAN_EMIT_SYMBOL, 74),
+ (9, HUFFMAN_EMIT_SYMBOL, 74),
+ (23, HUFFMAN_EMIT_SYMBOL, 74),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 74),
+ (2, HUFFMAN_EMIT_SYMBOL, 75),
+ (9, HUFFMAN_EMIT_SYMBOL, 75),
+ (23, HUFFMAN_EMIT_SYMBOL, 75),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 75),
+ (2, HUFFMAN_EMIT_SYMBOL, 76),
+ (9, HUFFMAN_EMIT_SYMBOL, 76),
+ (23, HUFFMAN_EMIT_SYMBOL, 76),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 76),
+
+ # Node 47
+ (3, HUFFMAN_EMIT_SYMBOL, 73),
+ (6, HUFFMAN_EMIT_SYMBOL, 73),
+ (10, HUFFMAN_EMIT_SYMBOL, 73),
+ (15, HUFFMAN_EMIT_SYMBOL, 73),
+ (24, HUFFMAN_EMIT_SYMBOL, 73),
+ (31, HUFFMAN_EMIT_SYMBOL, 73),
+ (41, HUFFMAN_EMIT_SYMBOL, 73),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 73),
+ (3, HUFFMAN_EMIT_SYMBOL, 74),
+ (6, HUFFMAN_EMIT_SYMBOL, 74),
+ (10, HUFFMAN_EMIT_SYMBOL, 74),
+ (15, HUFFMAN_EMIT_SYMBOL, 74),
+ (24, HUFFMAN_EMIT_SYMBOL, 74),
+ (31, HUFFMAN_EMIT_SYMBOL, 74),
+ (41, HUFFMAN_EMIT_SYMBOL, 74),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 74),
+
+ # Node 48
+ (3, HUFFMAN_EMIT_SYMBOL, 75),
+ (6, HUFFMAN_EMIT_SYMBOL, 75),
+ (10, HUFFMAN_EMIT_SYMBOL, 75),
+ (15, HUFFMAN_EMIT_SYMBOL, 75),
+ (24, HUFFMAN_EMIT_SYMBOL, 75),
+ (31, HUFFMAN_EMIT_SYMBOL, 75),
+ (41, HUFFMAN_EMIT_SYMBOL, 75),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 75),
+ (3, HUFFMAN_EMIT_SYMBOL, 76),
+ (6, HUFFMAN_EMIT_SYMBOL, 76),
+ (10, HUFFMAN_EMIT_SYMBOL, 76),
+ (15, HUFFMAN_EMIT_SYMBOL, 76),
+ (24, HUFFMAN_EMIT_SYMBOL, 76),
+ (31, HUFFMAN_EMIT_SYMBOL, 76),
+ (41, HUFFMAN_EMIT_SYMBOL, 76),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 76),
+
+ # Node 49
+ (1, HUFFMAN_EMIT_SYMBOL, 77),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 77),
+ (1, HUFFMAN_EMIT_SYMBOL, 78),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 78),
+ (1, HUFFMAN_EMIT_SYMBOL, 79),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 79),
+ (1, HUFFMAN_EMIT_SYMBOL, 80),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 80),
+ (1, HUFFMAN_EMIT_SYMBOL, 81),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 81),
+ (1, HUFFMAN_EMIT_SYMBOL, 82),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 82),
+ (1, HUFFMAN_EMIT_SYMBOL, 83),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 83),
+ (1, HUFFMAN_EMIT_SYMBOL, 84),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 84),
+
+ # Node 50
+ (2, HUFFMAN_EMIT_SYMBOL, 77),
+ (9, HUFFMAN_EMIT_SYMBOL, 77),
+ (23, HUFFMAN_EMIT_SYMBOL, 77),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 77),
+ (2, HUFFMAN_EMIT_SYMBOL, 78),
+ (9, HUFFMAN_EMIT_SYMBOL, 78),
+ (23, HUFFMAN_EMIT_SYMBOL, 78),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 78),
+ (2, HUFFMAN_EMIT_SYMBOL, 79),
+ (9, HUFFMAN_EMIT_SYMBOL, 79),
+ (23, HUFFMAN_EMIT_SYMBOL, 79),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 79),
+ (2, HUFFMAN_EMIT_SYMBOL, 80),
+ (9, HUFFMAN_EMIT_SYMBOL, 80),
+ (23, HUFFMAN_EMIT_SYMBOL, 80),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 80),
+
+ # Node 51
+ (3, HUFFMAN_EMIT_SYMBOL, 77),
+ (6, HUFFMAN_EMIT_SYMBOL, 77),
+ (10, HUFFMAN_EMIT_SYMBOL, 77),
+ (15, HUFFMAN_EMIT_SYMBOL, 77),
+ (24, HUFFMAN_EMIT_SYMBOL, 77),
+ (31, HUFFMAN_EMIT_SYMBOL, 77),
+ (41, HUFFMAN_EMIT_SYMBOL, 77),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 77),
+ (3, HUFFMAN_EMIT_SYMBOL, 78),
+ (6, HUFFMAN_EMIT_SYMBOL, 78),
+ (10, HUFFMAN_EMIT_SYMBOL, 78),
+ (15, HUFFMAN_EMIT_SYMBOL, 78),
+ (24, HUFFMAN_EMIT_SYMBOL, 78),
+ (31, HUFFMAN_EMIT_SYMBOL, 78),
+ (41, HUFFMAN_EMIT_SYMBOL, 78),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 78),
+
+ # Node 52
+ (3, HUFFMAN_EMIT_SYMBOL, 79),
+ (6, HUFFMAN_EMIT_SYMBOL, 79),
+ (10, HUFFMAN_EMIT_SYMBOL, 79),
+ (15, HUFFMAN_EMIT_SYMBOL, 79),
+ (24, HUFFMAN_EMIT_SYMBOL, 79),
+ (31, HUFFMAN_EMIT_SYMBOL, 79),
+ (41, HUFFMAN_EMIT_SYMBOL, 79),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 79),
+ (3, HUFFMAN_EMIT_SYMBOL, 80),
+ (6, HUFFMAN_EMIT_SYMBOL, 80),
+ (10, HUFFMAN_EMIT_SYMBOL, 80),
+ (15, HUFFMAN_EMIT_SYMBOL, 80),
+ (24, HUFFMAN_EMIT_SYMBOL, 80),
+ (31, HUFFMAN_EMIT_SYMBOL, 80),
+ (41, HUFFMAN_EMIT_SYMBOL, 80),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 80),
+
+ # Node 53
+ (2, HUFFMAN_EMIT_SYMBOL, 81),
+ (9, HUFFMAN_EMIT_SYMBOL, 81),
+ (23, HUFFMAN_EMIT_SYMBOL, 81),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 81),
+ (2, HUFFMAN_EMIT_SYMBOL, 82),
+ (9, HUFFMAN_EMIT_SYMBOL, 82),
+ (23, HUFFMAN_EMIT_SYMBOL, 82),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 82),
+ (2, HUFFMAN_EMIT_SYMBOL, 83),
+ (9, HUFFMAN_EMIT_SYMBOL, 83),
+ (23, HUFFMAN_EMIT_SYMBOL, 83),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 83),
+ (2, HUFFMAN_EMIT_SYMBOL, 84),
+ (9, HUFFMAN_EMIT_SYMBOL, 84),
+ (23, HUFFMAN_EMIT_SYMBOL, 84),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 84),
+
+ # Node 54
+ (3, HUFFMAN_EMIT_SYMBOL, 81),
+ (6, HUFFMAN_EMIT_SYMBOL, 81),
+ (10, HUFFMAN_EMIT_SYMBOL, 81),
+ (15, HUFFMAN_EMIT_SYMBOL, 81),
+ (24, HUFFMAN_EMIT_SYMBOL, 81),
+ (31, HUFFMAN_EMIT_SYMBOL, 81),
+ (41, HUFFMAN_EMIT_SYMBOL, 81),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 81),
+ (3, HUFFMAN_EMIT_SYMBOL, 82),
+ (6, HUFFMAN_EMIT_SYMBOL, 82),
+ (10, HUFFMAN_EMIT_SYMBOL, 82),
+ (15, HUFFMAN_EMIT_SYMBOL, 82),
+ (24, HUFFMAN_EMIT_SYMBOL, 82),
+ (31, HUFFMAN_EMIT_SYMBOL, 82),
+ (41, HUFFMAN_EMIT_SYMBOL, 82),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 82),
+
+ # Node 55
+ (3, HUFFMAN_EMIT_SYMBOL, 83),
+ (6, HUFFMAN_EMIT_SYMBOL, 83),
+ (10, HUFFMAN_EMIT_SYMBOL, 83),
+ (15, HUFFMAN_EMIT_SYMBOL, 83),
+ (24, HUFFMAN_EMIT_SYMBOL, 83),
+ (31, HUFFMAN_EMIT_SYMBOL, 83),
+ (41, HUFFMAN_EMIT_SYMBOL, 83),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 83),
+ (3, HUFFMAN_EMIT_SYMBOL, 84),
+ (6, HUFFMAN_EMIT_SYMBOL, 84),
+ (10, HUFFMAN_EMIT_SYMBOL, 84),
+ (15, HUFFMAN_EMIT_SYMBOL, 84),
+ (24, HUFFMAN_EMIT_SYMBOL, 84),
+ (31, HUFFMAN_EMIT_SYMBOL, 84),
+ (41, HUFFMAN_EMIT_SYMBOL, 84),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 84),
+
+ # Node 56
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 85),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 86),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 87),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 89),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 106),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 107),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 113),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 118),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 119),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 120),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 121),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 122),
+ (70, 0, 0),
+ (71, 0, 0),
+ (73, 0, 0),
+ (74, HUFFMAN_COMPLETE, 0),
+
+ # Node 57
+ (1, HUFFMAN_EMIT_SYMBOL, 85),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 85),
+ (1, HUFFMAN_EMIT_SYMBOL, 86),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 86),
+ (1, HUFFMAN_EMIT_SYMBOL, 87),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 87),
+ (1, HUFFMAN_EMIT_SYMBOL, 89),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 89),
+ (1, HUFFMAN_EMIT_SYMBOL, 106),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 106),
+ (1, HUFFMAN_EMIT_SYMBOL, 107),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 107),
+ (1, HUFFMAN_EMIT_SYMBOL, 113),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 113),
+ (1, HUFFMAN_EMIT_SYMBOL, 118),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 118),
+
+ # Node 58
+ (2, HUFFMAN_EMIT_SYMBOL, 85),
+ (9, HUFFMAN_EMIT_SYMBOL, 85),
+ (23, HUFFMAN_EMIT_SYMBOL, 85),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 85),
+ (2, HUFFMAN_EMIT_SYMBOL, 86),
+ (9, HUFFMAN_EMIT_SYMBOL, 86),
+ (23, HUFFMAN_EMIT_SYMBOL, 86),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 86),
+ (2, HUFFMAN_EMIT_SYMBOL, 87),
+ (9, HUFFMAN_EMIT_SYMBOL, 87),
+ (23, HUFFMAN_EMIT_SYMBOL, 87),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 87),
+ (2, HUFFMAN_EMIT_SYMBOL, 89),
+ (9, HUFFMAN_EMIT_SYMBOL, 89),
+ (23, HUFFMAN_EMIT_SYMBOL, 89),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 89),
+
+ # Node 59
+ (3, HUFFMAN_EMIT_SYMBOL, 85),
+ (6, HUFFMAN_EMIT_SYMBOL, 85),
+ (10, HUFFMAN_EMIT_SYMBOL, 85),
+ (15, HUFFMAN_EMIT_SYMBOL, 85),
+ (24, HUFFMAN_EMIT_SYMBOL, 85),
+ (31, HUFFMAN_EMIT_SYMBOL, 85),
+ (41, HUFFMAN_EMIT_SYMBOL, 85),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 85),
+ (3, HUFFMAN_EMIT_SYMBOL, 86),
+ (6, HUFFMAN_EMIT_SYMBOL, 86),
+ (10, HUFFMAN_EMIT_SYMBOL, 86),
+ (15, HUFFMAN_EMIT_SYMBOL, 86),
+ (24, HUFFMAN_EMIT_SYMBOL, 86),
+ (31, HUFFMAN_EMIT_SYMBOL, 86),
+ (41, HUFFMAN_EMIT_SYMBOL, 86),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 86),
+
+ # Node 60
+ (3, HUFFMAN_EMIT_SYMBOL, 87),
+ (6, HUFFMAN_EMIT_SYMBOL, 87),
+ (10, HUFFMAN_EMIT_SYMBOL, 87),
+ (15, HUFFMAN_EMIT_SYMBOL, 87),
+ (24, HUFFMAN_EMIT_SYMBOL, 87),
+ (31, HUFFMAN_EMIT_SYMBOL, 87),
+ (41, HUFFMAN_EMIT_SYMBOL, 87),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 87),
+ (3, HUFFMAN_EMIT_SYMBOL, 89),
+ (6, HUFFMAN_EMIT_SYMBOL, 89),
+ (10, HUFFMAN_EMIT_SYMBOL, 89),
+ (15, HUFFMAN_EMIT_SYMBOL, 89),
+ (24, HUFFMAN_EMIT_SYMBOL, 89),
+ (31, HUFFMAN_EMIT_SYMBOL, 89),
+ (41, HUFFMAN_EMIT_SYMBOL, 89),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 89),
+
+ # Node 61
+ (2, HUFFMAN_EMIT_SYMBOL, 106),
+ (9, HUFFMAN_EMIT_SYMBOL, 106),
+ (23, HUFFMAN_EMIT_SYMBOL, 106),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 106),
+ (2, HUFFMAN_EMIT_SYMBOL, 107),
+ (9, HUFFMAN_EMIT_SYMBOL, 107),
+ (23, HUFFMAN_EMIT_SYMBOL, 107),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 107),
+ (2, HUFFMAN_EMIT_SYMBOL, 113),
+ (9, HUFFMAN_EMIT_SYMBOL, 113),
+ (23, HUFFMAN_EMIT_SYMBOL, 113),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 113),
+ (2, HUFFMAN_EMIT_SYMBOL, 118),
+ (9, HUFFMAN_EMIT_SYMBOL, 118),
+ (23, HUFFMAN_EMIT_SYMBOL, 118),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 118),
+
+ # Node 62
+ (3, HUFFMAN_EMIT_SYMBOL, 106),
+ (6, HUFFMAN_EMIT_SYMBOL, 106),
+ (10, HUFFMAN_EMIT_SYMBOL, 106),
+ (15, HUFFMAN_EMIT_SYMBOL, 106),
+ (24, HUFFMAN_EMIT_SYMBOL, 106),
+ (31, HUFFMAN_EMIT_SYMBOL, 106),
+ (41, HUFFMAN_EMIT_SYMBOL, 106),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 106),
+ (3, HUFFMAN_EMIT_SYMBOL, 107),
+ (6, HUFFMAN_EMIT_SYMBOL, 107),
+ (10, HUFFMAN_EMIT_SYMBOL, 107),
+ (15, HUFFMAN_EMIT_SYMBOL, 107),
+ (24, HUFFMAN_EMIT_SYMBOL, 107),
+ (31, HUFFMAN_EMIT_SYMBOL, 107),
+ (41, HUFFMAN_EMIT_SYMBOL, 107),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 107),
+
+ # Node 63
+ (3, HUFFMAN_EMIT_SYMBOL, 113),
+ (6, HUFFMAN_EMIT_SYMBOL, 113),
+ (10, HUFFMAN_EMIT_SYMBOL, 113),
+ (15, HUFFMAN_EMIT_SYMBOL, 113),
+ (24, HUFFMAN_EMIT_SYMBOL, 113),
+ (31, HUFFMAN_EMIT_SYMBOL, 113),
+ (41, HUFFMAN_EMIT_SYMBOL, 113),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 113),
+ (3, HUFFMAN_EMIT_SYMBOL, 118),
+ (6, HUFFMAN_EMIT_SYMBOL, 118),
+ (10, HUFFMAN_EMIT_SYMBOL, 118),
+ (15, HUFFMAN_EMIT_SYMBOL, 118),
+ (24, HUFFMAN_EMIT_SYMBOL, 118),
+ (31, HUFFMAN_EMIT_SYMBOL, 118),
+ (41, HUFFMAN_EMIT_SYMBOL, 118),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 118),
+
+ # Node 64
+ (1, HUFFMAN_EMIT_SYMBOL, 119),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 119),
+ (1, HUFFMAN_EMIT_SYMBOL, 120),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 120),
+ (1, HUFFMAN_EMIT_SYMBOL, 121),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 121),
+ (1, HUFFMAN_EMIT_SYMBOL, 122),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 122),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 38),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 42),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 44),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 59),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 88),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 90),
+ (75, 0, 0),
+ (78, 0, 0),
+
+ # Node 65
+ (2, HUFFMAN_EMIT_SYMBOL, 119),
+ (9, HUFFMAN_EMIT_SYMBOL, 119),
+ (23, HUFFMAN_EMIT_SYMBOL, 119),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 119),
+ (2, HUFFMAN_EMIT_SYMBOL, 120),
+ (9, HUFFMAN_EMIT_SYMBOL, 120),
+ (23, HUFFMAN_EMIT_SYMBOL, 120),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 120),
+ (2, HUFFMAN_EMIT_SYMBOL, 121),
+ (9, HUFFMAN_EMIT_SYMBOL, 121),
+ (23, HUFFMAN_EMIT_SYMBOL, 121),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 121),
+ (2, HUFFMAN_EMIT_SYMBOL, 122),
+ (9, HUFFMAN_EMIT_SYMBOL, 122),
+ (23, HUFFMAN_EMIT_SYMBOL, 122),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 122),
+
+ # Node 66
+ (3, HUFFMAN_EMIT_SYMBOL, 119),
+ (6, HUFFMAN_EMIT_SYMBOL, 119),
+ (10, HUFFMAN_EMIT_SYMBOL, 119),
+ (15, HUFFMAN_EMIT_SYMBOL, 119),
+ (24, HUFFMAN_EMIT_SYMBOL, 119),
+ (31, HUFFMAN_EMIT_SYMBOL, 119),
+ (41, HUFFMAN_EMIT_SYMBOL, 119),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 119),
+ (3, HUFFMAN_EMIT_SYMBOL, 120),
+ (6, HUFFMAN_EMIT_SYMBOL, 120),
+ (10, HUFFMAN_EMIT_SYMBOL, 120),
+ (15, HUFFMAN_EMIT_SYMBOL, 120),
+ (24, HUFFMAN_EMIT_SYMBOL, 120),
+ (31, HUFFMAN_EMIT_SYMBOL, 120),
+ (41, HUFFMAN_EMIT_SYMBOL, 120),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 120),
+
+ # Node 67
+ (3, HUFFMAN_EMIT_SYMBOL, 121),
+ (6, HUFFMAN_EMIT_SYMBOL, 121),
+ (10, HUFFMAN_EMIT_SYMBOL, 121),
+ (15, HUFFMAN_EMIT_SYMBOL, 121),
+ (24, HUFFMAN_EMIT_SYMBOL, 121),
+ (31, HUFFMAN_EMIT_SYMBOL, 121),
+ (41, HUFFMAN_EMIT_SYMBOL, 121),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 121),
+ (3, HUFFMAN_EMIT_SYMBOL, 122),
+ (6, HUFFMAN_EMIT_SYMBOL, 122),
+ (10, HUFFMAN_EMIT_SYMBOL, 122),
+ (15, HUFFMAN_EMIT_SYMBOL, 122),
+ (24, HUFFMAN_EMIT_SYMBOL, 122),
+ (31, HUFFMAN_EMIT_SYMBOL, 122),
+ (41, HUFFMAN_EMIT_SYMBOL, 122),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 122),
+
+ # Node 68
+ (1, HUFFMAN_EMIT_SYMBOL, 38),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 38),
+ (1, HUFFMAN_EMIT_SYMBOL, 42),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 42),
+ (1, HUFFMAN_EMIT_SYMBOL, 44),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 44),
+ (1, HUFFMAN_EMIT_SYMBOL, 59),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 59),
+ (1, HUFFMAN_EMIT_SYMBOL, 88),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 88),
+ (1, HUFFMAN_EMIT_SYMBOL, 90),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 90),
+ (76, 0, 0),
+ (77, 0, 0),
+ (79, 0, 0),
+ (81, 0, 0),
+
+ # Node 69
+ (2, HUFFMAN_EMIT_SYMBOL, 38),
+ (9, HUFFMAN_EMIT_SYMBOL, 38),
+ (23, HUFFMAN_EMIT_SYMBOL, 38),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 38),
+ (2, HUFFMAN_EMIT_SYMBOL, 42),
+ (9, HUFFMAN_EMIT_SYMBOL, 42),
+ (23, HUFFMAN_EMIT_SYMBOL, 42),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 42),
+ (2, HUFFMAN_EMIT_SYMBOL, 44),
+ (9, HUFFMAN_EMIT_SYMBOL, 44),
+ (23, HUFFMAN_EMIT_SYMBOL, 44),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 44),
+ (2, HUFFMAN_EMIT_SYMBOL, 59),
+ (9, HUFFMAN_EMIT_SYMBOL, 59),
+ (23, HUFFMAN_EMIT_SYMBOL, 59),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 59),
+
+ # Node 70
+ (3, HUFFMAN_EMIT_SYMBOL, 38),
+ (6, HUFFMAN_EMIT_SYMBOL, 38),
+ (10, HUFFMAN_EMIT_SYMBOL, 38),
+ (15, HUFFMAN_EMIT_SYMBOL, 38),
+ (24, HUFFMAN_EMIT_SYMBOL, 38),
+ (31, HUFFMAN_EMIT_SYMBOL, 38),
+ (41, HUFFMAN_EMIT_SYMBOL, 38),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 38),
+ (3, HUFFMAN_EMIT_SYMBOL, 42),
+ (6, HUFFMAN_EMIT_SYMBOL, 42),
+ (10, HUFFMAN_EMIT_SYMBOL, 42),
+ (15, HUFFMAN_EMIT_SYMBOL, 42),
+ (24, HUFFMAN_EMIT_SYMBOL, 42),
+ (31, HUFFMAN_EMIT_SYMBOL, 42),
+ (41, HUFFMAN_EMIT_SYMBOL, 42),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 42),
+
+ # Node 71
+ (3, HUFFMAN_EMIT_SYMBOL, 44),
+ (6, HUFFMAN_EMIT_SYMBOL, 44),
+ (10, HUFFMAN_EMIT_SYMBOL, 44),
+ (15, HUFFMAN_EMIT_SYMBOL, 44),
+ (24, HUFFMAN_EMIT_SYMBOL, 44),
+ (31, HUFFMAN_EMIT_SYMBOL, 44),
+ (41, HUFFMAN_EMIT_SYMBOL, 44),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 44),
+ (3, HUFFMAN_EMIT_SYMBOL, 59),
+ (6, HUFFMAN_EMIT_SYMBOL, 59),
+ (10, HUFFMAN_EMIT_SYMBOL, 59),
+ (15, HUFFMAN_EMIT_SYMBOL, 59),
+ (24, HUFFMAN_EMIT_SYMBOL, 59),
+ (31, HUFFMAN_EMIT_SYMBOL, 59),
+ (41, HUFFMAN_EMIT_SYMBOL, 59),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 59),
+
+ # Node 72
+ (2, HUFFMAN_EMIT_SYMBOL, 88),
+ (9, HUFFMAN_EMIT_SYMBOL, 88),
+ (23, HUFFMAN_EMIT_SYMBOL, 88),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 88),
+ (2, HUFFMAN_EMIT_SYMBOL, 90),
+ (9, HUFFMAN_EMIT_SYMBOL, 90),
+ (23, HUFFMAN_EMIT_SYMBOL, 90),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 90),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 33),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 34),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 40),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 41),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 63),
+ (80, 0, 0),
+ (82, 0, 0),
+ (84, 0, 0),
+
+ # Node 73
+ (3, HUFFMAN_EMIT_SYMBOL, 88),
+ (6, HUFFMAN_EMIT_SYMBOL, 88),
+ (10, HUFFMAN_EMIT_SYMBOL, 88),
+ (15, HUFFMAN_EMIT_SYMBOL, 88),
+ (24, HUFFMAN_EMIT_SYMBOL, 88),
+ (31, HUFFMAN_EMIT_SYMBOL, 88),
+ (41, HUFFMAN_EMIT_SYMBOL, 88),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 88),
+ (3, HUFFMAN_EMIT_SYMBOL, 90),
+ (6, HUFFMAN_EMIT_SYMBOL, 90),
+ (10, HUFFMAN_EMIT_SYMBOL, 90),
+ (15, HUFFMAN_EMIT_SYMBOL, 90),
+ (24, HUFFMAN_EMIT_SYMBOL, 90),
+ (31, HUFFMAN_EMIT_SYMBOL, 90),
+ (41, HUFFMAN_EMIT_SYMBOL, 90),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 90),
+
+ # Node 74
+ (1, HUFFMAN_EMIT_SYMBOL, 33),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 33),
+ (1, HUFFMAN_EMIT_SYMBOL, 34),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 34),
+ (1, HUFFMAN_EMIT_SYMBOL, 40),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 40),
+ (1, HUFFMAN_EMIT_SYMBOL, 41),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 41),
+ (1, HUFFMAN_EMIT_SYMBOL, 63),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 63),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 39),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 43),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 124),
+ (83, 0, 0),
+ (85, 0, 0),
+ (88, 0, 0),
+
+ # Node 75
+ (2, HUFFMAN_EMIT_SYMBOL, 33),
+ (9, HUFFMAN_EMIT_SYMBOL, 33),
+ (23, HUFFMAN_EMIT_SYMBOL, 33),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 33),
+ (2, HUFFMAN_EMIT_SYMBOL, 34),
+ (9, HUFFMAN_EMIT_SYMBOL, 34),
+ (23, HUFFMAN_EMIT_SYMBOL, 34),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 34),
+ (2, HUFFMAN_EMIT_SYMBOL, 40),
+ (9, HUFFMAN_EMIT_SYMBOL, 40),
+ (23, HUFFMAN_EMIT_SYMBOL, 40),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 40),
+ (2, HUFFMAN_EMIT_SYMBOL, 41),
+ (9, HUFFMAN_EMIT_SYMBOL, 41),
+ (23, HUFFMAN_EMIT_SYMBOL, 41),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 41),
+
+ # Node 76
+ (3, HUFFMAN_EMIT_SYMBOL, 33),
+ (6, HUFFMAN_EMIT_SYMBOL, 33),
+ (10, HUFFMAN_EMIT_SYMBOL, 33),
+ (15, HUFFMAN_EMIT_SYMBOL, 33),
+ (24, HUFFMAN_EMIT_SYMBOL, 33),
+ (31, HUFFMAN_EMIT_SYMBOL, 33),
+ (41, HUFFMAN_EMIT_SYMBOL, 33),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 33),
+ (3, HUFFMAN_EMIT_SYMBOL, 34),
+ (6, HUFFMAN_EMIT_SYMBOL, 34),
+ (10, HUFFMAN_EMIT_SYMBOL, 34),
+ (15, HUFFMAN_EMIT_SYMBOL, 34),
+ (24, HUFFMAN_EMIT_SYMBOL, 34),
+ (31, HUFFMAN_EMIT_SYMBOL, 34),
+ (41, HUFFMAN_EMIT_SYMBOL, 34),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 34),
+
+ # Node 77
+ (3, HUFFMAN_EMIT_SYMBOL, 40),
+ (6, HUFFMAN_EMIT_SYMBOL, 40),
+ (10, HUFFMAN_EMIT_SYMBOL, 40),
+ (15, HUFFMAN_EMIT_SYMBOL, 40),
+ (24, HUFFMAN_EMIT_SYMBOL, 40),
+ (31, HUFFMAN_EMIT_SYMBOL, 40),
+ (41, HUFFMAN_EMIT_SYMBOL, 40),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 40),
+ (3, HUFFMAN_EMIT_SYMBOL, 41),
+ (6, HUFFMAN_EMIT_SYMBOL, 41),
+ (10, HUFFMAN_EMIT_SYMBOL, 41),
+ (15, HUFFMAN_EMIT_SYMBOL, 41),
+ (24, HUFFMAN_EMIT_SYMBOL, 41),
+ (31, HUFFMAN_EMIT_SYMBOL, 41),
+ (41, HUFFMAN_EMIT_SYMBOL, 41),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 41),
+
+ # Node 78
+ (2, HUFFMAN_EMIT_SYMBOL, 63),
+ (9, HUFFMAN_EMIT_SYMBOL, 63),
+ (23, HUFFMAN_EMIT_SYMBOL, 63),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 63),
+ (1, HUFFMAN_EMIT_SYMBOL, 39),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 39),
+ (1, HUFFMAN_EMIT_SYMBOL, 43),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 43),
+ (1, HUFFMAN_EMIT_SYMBOL, 124),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 124),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 35),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 62),
+ (86, 0, 0),
+ (87, 0, 0),
+ (89, 0, 0),
+ (90, 0, 0),
+
+ # Node 79
+ (3, HUFFMAN_EMIT_SYMBOL, 63),
+ (6, HUFFMAN_EMIT_SYMBOL, 63),
+ (10, HUFFMAN_EMIT_SYMBOL, 63),
+ (15, HUFFMAN_EMIT_SYMBOL, 63),
+ (24, HUFFMAN_EMIT_SYMBOL, 63),
+ (31, HUFFMAN_EMIT_SYMBOL, 63),
+ (41, HUFFMAN_EMIT_SYMBOL, 63),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 63),
+ (2, HUFFMAN_EMIT_SYMBOL, 39),
+ (9, HUFFMAN_EMIT_SYMBOL, 39),
+ (23, HUFFMAN_EMIT_SYMBOL, 39),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 39),
+ (2, HUFFMAN_EMIT_SYMBOL, 43),
+ (9, HUFFMAN_EMIT_SYMBOL, 43),
+ (23, HUFFMAN_EMIT_SYMBOL, 43),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 43),
+
+ # Node 80
+ (3, HUFFMAN_EMIT_SYMBOL, 39),
+ (6, HUFFMAN_EMIT_SYMBOL, 39),
+ (10, HUFFMAN_EMIT_SYMBOL, 39),
+ (15, HUFFMAN_EMIT_SYMBOL, 39),
+ (24, HUFFMAN_EMIT_SYMBOL, 39),
+ (31, HUFFMAN_EMIT_SYMBOL, 39),
+ (41, HUFFMAN_EMIT_SYMBOL, 39),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 39),
+ (3, HUFFMAN_EMIT_SYMBOL, 43),
+ (6, HUFFMAN_EMIT_SYMBOL, 43),
+ (10, HUFFMAN_EMIT_SYMBOL, 43),
+ (15, HUFFMAN_EMIT_SYMBOL, 43),
+ (24, HUFFMAN_EMIT_SYMBOL, 43),
+ (31, HUFFMAN_EMIT_SYMBOL, 43),
+ (41, HUFFMAN_EMIT_SYMBOL, 43),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 43),
+
+ # Node 81
+ (2, HUFFMAN_EMIT_SYMBOL, 124),
+ (9, HUFFMAN_EMIT_SYMBOL, 124),
+ (23, HUFFMAN_EMIT_SYMBOL, 124),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 124),
+ (1, HUFFMAN_EMIT_SYMBOL, 35),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 35),
+ (1, HUFFMAN_EMIT_SYMBOL, 62),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 62),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 0),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 36),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 64),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 91),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 93),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 126),
+ (91, 0, 0),
+ (92, 0, 0),
+
+ # Node 82
+ (3, HUFFMAN_EMIT_SYMBOL, 124),
+ (6, HUFFMAN_EMIT_SYMBOL, 124),
+ (10, HUFFMAN_EMIT_SYMBOL, 124),
+ (15, HUFFMAN_EMIT_SYMBOL, 124),
+ (24, HUFFMAN_EMIT_SYMBOL, 124),
+ (31, HUFFMAN_EMIT_SYMBOL, 124),
+ (41, HUFFMAN_EMIT_SYMBOL, 124),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 124),
+ (2, HUFFMAN_EMIT_SYMBOL, 35),
+ (9, HUFFMAN_EMIT_SYMBOL, 35),
+ (23, HUFFMAN_EMIT_SYMBOL, 35),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 35),
+ (2, HUFFMAN_EMIT_SYMBOL, 62),
+ (9, HUFFMAN_EMIT_SYMBOL, 62),
+ (23, HUFFMAN_EMIT_SYMBOL, 62),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 62),
+
+ # Node 83
+ (3, HUFFMAN_EMIT_SYMBOL, 35),
+ (6, HUFFMAN_EMIT_SYMBOL, 35),
+ (10, HUFFMAN_EMIT_SYMBOL, 35),
+ (15, HUFFMAN_EMIT_SYMBOL, 35),
+ (24, HUFFMAN_EMIT_SYMBOL, 35),
+ (31, HUFFMAN_EMIT_SYMBOL, 35),
+ (41, HUFFMAN_EMIT_SYMBOL, 35),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 35),
+ (3, HUFFMAN_EMIT_SYMBOL, 62),
+ (6, HUFFMAN_EMIT_SYMBOL, 62),
+ (10, HUFFMAN_EMIT_SYMBOL, 62),
+ (15, HUFFMAN_EMIT_SYMBOL, 62),
+ (24, HUFFMAN_EMIT_SYMBOL, 62),
+ (31, HUFFMAN_EMIT_SYMBOL, 62),
+ (41, HUFFMAN_EMIT_SYMBOL, 62),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 62),
+
+ # Node 84
+ (1, HUFFMAN_EMIT_SYMBOL, 0),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 0),
+ (1, HUFFMAN_EMIT_SYMBOL, 36),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 36),
+ (1, HUFFMAN_EMIT_SYMBOL, 64),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 64),
+ (1, HUFFMAN_EMIT_SYMBOL, 91),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 91),
+ (1, HUFFMAN_EMIT_SYMBOL, 93),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 93),
+ (1, HUFFMAN_EMIT_SYMBOL, 126),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 126),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 94),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 125),
+ (93, 0, 0),
+ (94, 0, 0),
+
+ # Node 85
+ (2, HUFFMAN_EMIT_SYMBOL, 0),
+ (9, HUFFMAN_EMIT_SYMBOL, 0),
+ (23, HUFFMAN_EMIT_SYMBOL, 0),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 0),
+ (2, HUFFMAN_EMIT_SYMBOL, 36),
+ (9, HUFFMAN_EMIT_SYMBOL, 36),
+ (23, HUFFMAN_EMIT_SYMBOL, 36),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 36),
+ (2, HUFFMAN_EMIT_SYMBOL, 64),
+ (9, HUFFMAN_EMIT_SYMBOL, 64),
+ (23, HUFFMAN_EMIT_SYMBOL, 64),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 64),
+ (2, HUFFMAN_EMIT_SYMBOL, 91),
+ (9, HUFFMAN_EMIT_SYMBOL, 91),
+ (23, HUFFMAN_EMIT_SYMBOL, 91),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 91),
+
+ # Node 86
+ (3, HUFFMAN_EMIT_SYMBOL, 0),
+ (6, HUFFMAN_EMIT_SYMBOL, 0),
+ (10, HUFFMAN_EMIT_SYMBOL, 0),
+ (15, HUFFMAN_EMIT_SYMBOL, 0),
+ (24, HUFFMAN_EMIT_SYMBOL, 0),
+ (31, HUFFMAN_EMIT_SYMBOL, 0),
+ (41, HUFFMAN_EMIT_SYMBOL, 0),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 0),
+ (3, HUFFMAN_EMIT_SYMBOL, 36),
+ (6, HUFFMAN_EMIT_SYMBOL, 36),
+ (10, HUFFMAN_EMIT_SYMBOL, 36),
+ (15, HUFFMAN_EMIT_SYMBOL, 36),
+ (24, HUFFMAN_EMIT_SYMBOL, 36),
+ (31, HUFFMAN_EMIT_SYMBOL, 36),
+ (41, HUFFMAN_EMIT_SYMBOL, 36),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 36),
+
+ # Node 87
+ (3, HUFFMAN_EMIT_SYMBOL, 64),
+ (6, HUFFMAN_EMIT_SYMBOL, 64),
+ (10, HUFFMAN_EMIT_SYMBOL, 64),
+ (15, HUFFMAN_EMIT_SYMBOL, 64),
+ (24, HUFFMAN_EMIT_SYMBOL, 64),
+ (31, HUFFMAN_EMIT_SYMBOL, 64),
+ (41, HUFFMAN_EMIT_SYMBOL, 64),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 64),
+ (3, HUFFMAN_EMIT_SYMBOL, 91),
+ (6, HUFFMAN_EMIT_SYMBOL, 91),
+ (10, HUFFMAN_EMIT_SYMBOL, 91),
+ (15, HUFFMAN_EMIT_SYMBOL, 91),
+ (24, HUFFMAN_EMIT_SYMBOL, 91),
+ (31, HUFFMAN_EMIT_SYMBOL, 91),
+ (41, HUFFMAN_EMIT_SYMBOL, 91),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 91),
+
+ # Node 88
+ (2, HUFFMAN_EMIT_SYMBOL, 93),
+ (9, HUFFMAN_EMIT_SYMBOL, 93),
+ (23, HUFFMAN_EMIT_SYMBOL, 93),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 93),
+ (2, HUFFMAN_EMIT_SYMBOL, 126),
+ (9, HUFFMAN_EMIT_SYMBOL, 126),
+ (23, HUFFMAN_EMIT_SYMBOL, 126),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 126),
+ (1, HUFFMAN_EMIT_SYMBOL, 94),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 94),
+ (1, HUFFMAN_EMIT_SYMBOL, 125),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 125),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 60),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 96),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 123),
+ (95, 0, 0),
+
+ # Node 89
+ (3, HUFFMAN_EMIT_SYMBOL, 93),
+ (6, HUFFMAN_EMIT_SYMBOL, 93),
+ (10, HUFFMAN_EMIT_SYMBOL, 93),
+ (15, HUFFMAN_EMIT_SYMBOL, 93),
+ (24, HUFFMAN_EMIT_SYMBOL, 93),
+ (31, HUFFMAN_EMIT_SYMBOL, 93),
+ (41, HUFFMAN_EMIT_SYMBOL, 93),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 93),
+ (3, HUFFMAN_EMIT_SYMBOL, 126),
+ (6, HUFFMAN_EMIT_SYMBOL, 126),
+ (10, HUFFMAN_EMIT_SYMBOL, 126),
+ (15, HUFFMAN_EMIT_SYMBOL, 126),
+ (24, HUFFMAN_EMIT_SYMBOL, 126),
+ (31, HUFFMAN_EMIT_SYMBOL, 126),
+ (41, HUFFMAN_EMIT_SYMBOL, 126),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 126),
+
+ # Node 90
+ (2, HUFFMAN_EMIT_SYMBOL, 94),
+ (9, HUFFMAN_EMIT_SYMBOL, 94),
+ (23, HUFFMAN_EMIT_SYMBOL, 94),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 94),
+ (2, HUFFMAN_EMIT_SYMBOL, 125),
+ (9, HUFFMAN_EMIT_SYMBOL, 125),
+ (23, HUFFMAN_EMIT_SYMBOL, 125),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 125),
+ (1, HUFFMAN_EMIT_SYMBOL, 60),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 60),
+ (1, HUFFMAN_EMIT_SYMBOL, 96),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 96),
+ (1, HUFFMAN_EMIT_SYMBOL, 123),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 123),
+ (96, 0, 0),
+ (110, 0, 0),
+
+ # Node 91
+ (3, HUFFMAN_EMIT_SYMBOL, 94),
+ (6, HUFFMAN_EMIT_SYMBOL, 94),
+ (10, HUFFMAN_EMIT_SYMBOL, 94),
+ (15, HUFFMAN_EMIT_SYMBOL, 94),
+ (24, HUFFMAN_EMIT_SYMBOL, 94),
+ (31, HUFFMAN_EMIT_SYMBOL, 94),
+ (41, HUFFMAN_EMIT_SYMBOL, 94),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 94),
+ (3, HUFFMAN_EMIT_SYMBOL, 125),
+ (6, HUFFMAN_EMIT_SYMBOL, 125),
+ (10, HUFFMAN_EMIT_SYMBOL, 125),
+ (15, HUFFMAN_EMIT_SYMBOL, 125),
+ (24, HUFFMAN_EMIT_SYMBOL, 125),
+ (31, HUFFMAN_EMIT_SYMBOL, 125),
+ (41, HUFFMAN_EMIT_SYMBOL, 125),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 125),
+
+ # Node 92
+ (2, HUFFMAN_EMIT_SYMBOL, 60),
+ (9, HUFFMAN_EMIT_SYMBOL, 60),
+ (23, HUFFMAN_EMIT_SYMBOL, 60),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 60),
+ (2, HUFFMAN_EMIT_SYMBOL, 96),
+ (9, HUFFMAN_EMIT_SYMBOL, 96),
+ (23, HUFFMAN_EMIT_SYMBOL, 96),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 96),
+ (2, HUFFMAN_EMIT_SYMBOL, 123),
+ (9, HUFFMAN_EMIT_SYMBOL, 123),
+ (23, HUFFMAN_EMIT_SYMBOL, 123),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 123),
+ (97, 0, 0),
+ (101, 0, 0),
+ (111, 0, 0),
+ (133, 0, 0),
+
+ # Node 93
+ (3, HUFFMAN_EMIT_SYMBOL, 60),
+ (6, HUFFMAN_EMIT_SYMBOL, 60),
+ (10, HUFFMAN_EMIT_SYMBOL, 60),
+ (15, HUFFMAN_EMIT_SYMBOL, 60),
+ (24, HUFFMAN_EMIT_SYMBOL, 60),
+ (31, HUFFMAN_EMIT_SYMBOL, 60),
+ (41, HUFFMAN_EMIT_SYMBOL, 60),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 60),
+ (3, HUFFMAN_EMIT_SYMBOL, 96),
+ (6, HUFFMAN_EMIT_SYMBOL, 96),
+ (10, HUFFMAN_EMIT_SYMBOL, 96),
+ (15, HUFFMAN_EMIT_SYMBOL, 96),
+ (24, HUFFMAN_EMIT_SYMBOL, 96),
+ (31, HUFFMAN_EMIT_SYMBOL, 96),
+ (41, HUFFMAN_EMIT_SYMBOL, 96),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 96),
+
+ # Node 94
+ (3, HUFFMAN_EMIT_SYMBOL, 123),
+ (6, HUFFMAN_EMIT_SYMBOL, 123),
+ (10, HUFFMAN_EMIT_SYMBOL, 123),
+ (15, HUFFMAN_EMIT_SYMBOL, 123),
+ (24, HUFFMAN_EMIT_SYMBOL, 123),
+ (31, HUFFMAN_EMIT_SYMBOL, 123),
+ (41, HUFFMAN_EMIT_SYMBOL, 123),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 123),
+ (98, 0, 0),
+ (99, 0, 0),
+ (102, 0, 0),
+ (105, 0, 0),
+ (112, 0, 0),
+ (119, 0, 0),
+ (134, 0, 0),
+ (153, 0, 0),
+
+ # Node 95
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 92),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 195),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 208),
+ (100, 0, 0),
+ (103, 0, 0),
+ (104, 0, 0),
+ (106, 0, 0),
+ (107, 0, 0),
+ (113, 0, 0),
+ (116, 0, 0),
+ (120, 0, 0),
+ (126, 0, 0),
+ (135, 0, 0),
+ (142, 0, 0),
+ (154, 0, 0),
+ (169, 0, 0),
+
+ # Node 96
+ (1, HUFFMAN_EMIT_SYMBOL, 92),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 92),
+ (1, HUFFMAN_EMIT_SYMBOL, 195),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 195),
+ (1, HUFFMAN_EMIT_SYMBOL, 208),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 208),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 128),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 130),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 131),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 162),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 184),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 194),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 224),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 226),
+ (108, 0, 0),
+ (109, 0, 0),
+
+ # Node 97
+ (2, HUFFMAN_EMIT_SYMBOL, 92),
+ (9, HUFFMAN_EMIT_SYMBOL, 92),
+ (23, HUFFMAN_EMIT_SYMBOL, 92),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 92),
+ (2, HUFFMAN_EMIT_SYMBOL, 195),
+ (9, HUFFMAN_EMIT_SYMBOL, 195),
+ (23, HUFFMAN_EMIT_SYMBOL, 195),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 195),
+ (2, HUFFMAN_EMIT_SYMBOL, 208),
+ (9, HUFFMAN_EMIT_SYMBOL, 208),
+ (23, HUFFMAN_EMIT_SYMBOL, 208),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 208),
+ (1, HUFFMAN_EMIT_SYMBOL, 128),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 128),
+ (1, HUFFMAN_EMIT_SYMBOL, 130),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 130),
+
+ # Node 98
+ (3, HUFFMAN_EMIT_SYMBOL, 92),
+ (6, HUFFMAN_EMIT_SYMBOL, 92),
+ (10, HUFFMAN_EMIT_SYMBOL, 92),
+ (15, HUFFMAN_EMIT_SYMBOL, 92),
+ (24, HUFFMAN_EMIT_SYMBOL, 92),
+ (31, HUFFMAN_EMIT_SYMBOL, 92),
+ (41, HUFFMAN_EMIT_SYMBOL, 92),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 92),
+ (3, HUFFMAN_EMIT_SYMBOL, 195),
+ (6, HUFFMAN_EMIT_SYMBOL, 195),
+ (10, HUFFMAN_EMIT_SYMBOL, 195),
+ (15, HUFFMAN_EMIT_SYMBOL, 195),
+ (24, HUFFMAN_EMIT_SYMBOL, 195),
+ (31, HUFFMAN_EMIT_SYMBOL, 195),
+ (41, HUFFMAN_EMIT_SYMBOL, 195),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 195),
+
+ # Node 99
+ (3, HUFFMAN_EMIT_SYMBOL, 208),
+ (6, HUFFMAN_EMIT_SYMBOL, 208),
+ (10, HUFFMAN_EMIT_SYMBOL, 208),
+ (15, HUFFMAN_EMIT_SYMBOL, 208),
+ (24, HUFFMAN_EMIT_SYMBOL, 208),
+ (31, HUFFMAN_EMIT_SYMBOL, 208),
+ (41, HUFFMAN_EMIT_SYMBOL, 208),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 208),
+ (2, HUFFMAN_EMIT_SYMBOL, 128),
+ (9, HUFFMAN_EMIT_SYMBOL, 128),
+ (23, HUFFMAN_EMIT_SYMBOL, 128),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 128),
+ (2, HUFFMAN_EMIT_SYMBOL, 130),
+ (9, HUFFMAN_EMIT_SYMBOL, 130),
+ (23, HUFFMAN_EMIT_SYMBOL, 130),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 130),
+
+ # Node 100
+ (3, HUFFMAN_EMIT_SYMBOL, 128),
+ (6, HUFFMAN_EMIT_SYMBOL, 128),
+ (10, HUFFMAN_EMIT_SYMBOL, 128),
+ (15, HUFFMAN_EMIT_SYMBOL, 128),
+ (24, HUFFMAN_EMIT_SYMBOL, 128),
+ (31, HUFFMAN_EMIT_SYMBOL, 128),
+ (41, HUFFMAN_EMIT_SYMBOL, 128),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 128),
+ (3, HUFFMAN_EMIT_SYMBOL, 130),
+ (6, HUFFMAN_EMIT_SYMBOL, 130),
+ (10, HUFFMAN_EMIT_SYMBOL, 130),
+ (15, HUFFMAN_EMIT_SYMBOL, 130),
+ (24, HUFFMAN_EMIT_SYMBOL, 130),
+ (31, HUFFMAN_EMIT_SYMBOL, 130),
+ (41, HUFFMAN_EMIT_SYMBOL, 130),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 130),
+
+ # Node 101
+ (1, HUFFMAN_EMIT_SYMBOL, 131),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 131),
+ (1, HUFFMAN_EMIT_SYMBOL, 162),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 162),
+ (1, HUFFMAN_EMIT_SYMBOL, 184),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 184),
+ (1, HUFFMAN_EMIT_SYMBOL, 194),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 194),
+ (1, HUFFMAN_EMIT_SYMBOL, 224),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 224),
+ (1, HUFFMAN_EMIT_SYMBOL, 226),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 226),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 153),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 161),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 167),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 172),
+
+ # Node 102
+ (2, HUFFMAN_EMIT_SYMBOL, 131),
+ (9, HUFFMAN_EMIT_SYMBOL, 131),
+ (23, HUFFMAN_EMIT_SYMBOL, 131),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 131),
+ (2, HUFFMAN_EMIT_SYMBOL, 162),
+ (9, HUFFMAN_EMIT_SYMBOL, 162),
+ (23, HUFFMAN_EMIT_SYMBOL, 162),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 162),
+ (2, HUFFMAN_EMIT_SYMBOL, 184),
+ (9, HUFFMAN_EMIT_SYMBOL, 184),
+ (23, HUFFMAN_EMIT_SYMBOL, 184),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 184),
+ (2, HUFFMAN_EMIT_SYMBOL, 194),
+ (9, HUFFMAN_EMIT_SYMBOL, 194),
+ (23, HUFFMAN_EMIT_SYMBOL, 194),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 194),
+
+ # Node 103
+ (3, HUFFMAN_EMIT_SYMBOL, 131),
+ (6, HUFFMAN_EMIT_SYMBOL, 131),
+ (10, HUFFMAN_EMIT_SYMBOL, 131),
+ (15, HUFFMAN_EMIT_SYMBOL, 131),
+ (24, HUFFMAN_EMIT_SYMBOL, 131),
+ (31, HUFFMAN_EMIT_SYMBOL, 131),
+ (41, HUFFMAN_EMIT_SYMBOL, 131),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 131),
+ (3, HUFFMAN_EMIT_SYMBOL, 162),
+ (6, HUFFMAN_EMIT_SYMBOL, 162),
+ (10, HUFFMAN_EMIT_SYMBOL, 162),
+ (15, HUFFMAN_EMIT_SYMBOL, 162),
+ (24, HUFFMAN_EMIT_SYMBOL, 162),
+ (31, HUFFMAN_EMIT_SYMBOL, 162),
+ (41, HUFFMAN_EMIT_SYMBOL, 162),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 162),
+
+ # Node 104
+ (3, HUFFMAN_EMIT_SYMBOL, 184),
+ (6, HUFFMAN_EMIT_SYMBOL, 184),
+ (10, HUFFMAN_EMIT_SYMBOL, 184),
+ (15, HUFFMAN_EMIT_SYMBOL, 184),
+ (24, HUFFMAN_EMIT_SYMBOL, 184),
+ (31, HUFFMAN_EMIT_SYMBOL, 184),
+ (41, HUFFMAN_EMIT_SYMBOL, 184),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 184),
+ (3, HUFFMAN_EMIT_SYMBOL, 194),
+ (6, HUFFMAN_EMIT_SYMBOL, 194),
+ (10, HUFFMAN_EMIT_SYMBOL, 194),
+ (15, HUFFMAN_EMIT_SYMBOL, 194),
+ (24, HUFFMAN_EMIT_SYMBOL, 194),
+ (31, HUFFMAN_EMIT_SYMBOL, 194),
+ (41, HUFFMAN_EMIT_SYMBOL, 194),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 194),
+
+ # Node 105
+ (2, HUFFMAN_EMIT_SYMBOL, 224),
+ (9, HUFFMAN_EMIT_SYMBOL, 224),
+ (23, HUFFMAN_EMIT_SYMBOL, 224),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 224),
+ (2, HUFFMAN_EMIT_SYMBOL, 226),
+ (9, HUFFMAN_EMIT_SYMBOL, 226),
+ (23, HUFFMAN_EMIT_SYMBOL, 226),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 226),
+ (1, HUFFMAN_EMIT_SYMBOL, 153),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 153),
+ (1, HUFFMAN_EMIT_SYMBOL, 161),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 161),
+ (1, HUFFMAN_EMIT_SYMBOL, 167),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 167),
+ (1, HUFFMAN_EMIT_SYMBOL, 172),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 172),
+
+ # Node 106
+ (3, HUFFMAN_EMIT_SYMBOL, 224),
+ (6, HUFFMAN_EMIT_SYMBOL, 224),
+ (10, HUFFMAN_EMIT_SYMBOL, 224),
+ (15, HUFFMAN_EMIT_SYMBOL, 224),
+ (24, HUFFMAN_EMIT_SYMBOL, 224),
+ (31, HUFFMAN_EMIT_SYMBOL, 224),
+ (41, HUFFMAN_EMIT_SYMBOL, 224),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 224),
+ (3, HUFFMAN_EMIT_SYMBOL, 226),
+ (6, HUFFMAN_EMIT_SYMBOL, 226),
+ (10, HUFFMAN_EMIT_SYMBOL, 226),
+ (15, HUFFMAN_EMIT_SYMBOL, 226),
+ (24, HUFFMAN_EMIT_SYMBOL, 226),
+ (31, HUFFMAN_EMIT_SYMBOL, 226),
+ (41, HUFFMAN_EMIT_SYMBOL, 226),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 226),
+
+ # Node 107
+ (2, HUFFMAN_EMIT_SYMBOL, 153),
+ (9, HUFFMAN_EMIT_SYMBOL, 153),
+ (23, HUFFMAN_EMIT_SYMBOL, 153),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 153),
+ (2, HUFFMAN_EMIT_SYMBOL, 161),
+ (9, HUFFMAN_EMIT_SYMBOL, 161),
+ (23, HUFFMAN_EMIT_SYMBOL, 161),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 161),
+ (2, HUFFMAN_EMIT_SYMBOL, 167),
+ (9, HUFFMAN_EMIT_SYMBOL, 167),
+ (23, HUFFMAN_EMIT_SYMBOL, 167),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 167),
+ (2, HUFFMAN_EMIT_SYMBOL, 172),
+ (9, HUFFMAN_EMIT_SYMBOL, 172),
+ (23, HUFFMAN_EMIT_SYMBOL, 172),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 172),
+
+ # Node 108
+ (3, HUFFMAN_EMIT_SYMBOL, 153),
+ (6, HUFFMAN_EMIT_SYMBOL, 153),
+ (10, HUFFMAN_EMIT_SYMBOL, 153),
+ (15, HUFFMAN_EMIT_SYMBOL, 153),
+ (24, HUFFMAN_EMIT_SYMBOL, 153),
+ (31, HUFFMAN_EMIT_SYMBOL, 153),
+ (41, HUFFMAN_EMIT_SYMBOL, 153),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 153),
+ (3, HUFFMAN_EMIT_SYMBOL, 161),
+ (6, HUFFMAN_EMIT_SYMBOL, 161),
+ (10, HUFFMAN_EMIT_SYMBOL, 161),
+ (15, HUFFMAN_EMIT_SYMBOL, 161),
+ (24, HUFFMAN_EMIT_SYMBOL, 161),
+ (31, HUFFMAN_EMIT_SYMBOL, 161),
+ (41, HUFFMAN_EMIT_SYMBOL, 161),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 161),
+
+ # Node 109
+ (3, HUFFMAN_EMIT_SYMBOL, 167),
+ (6, HUFFMAN_EMIT_SYMBOL, 167),
+ (10, HUFFMAN_EMIT_SYMBOL, 167),
+ (15, HUFFMAN_EMIT_SYMBOL, 167),
+ (24, HUFFMAN_EMIT_SYMBOL, 167),
+ (31, HUFFMAN_EMIT_SYMBOL, 167),
+ (41, HUFFMAN_EMIT_SYMBOL, 167),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 167),
+ (3, HUFFMAN_EMIT_SYMBOL, 172),
+ (6, HUFFMAN_EMIT_SYMBOL, 172),
+ (10, HUFFMAN_EMIT_SYMBOL, 172),
+ (15, HUFFMAN_EMIT_SYMBOL, 172),
+ (24, HUFFMAN_EMIT_SYMBOL, 172),
+ (31, HUFFMAN_EMIT_SYMBOL, 172),
+ (41, HUFFMAN_EMIT_SYMBOL, 172),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 172),
+
+ # Node 110
+ (114, 0, 0),
+ (115, 0, 0),
+ (117, 0, 0),
+ (118, 0, 0),
+ (121, 0, 0),
+ (123, 0, 0),
+ (127, 0, 0),
+ (130, 0, 0),
+ (136, 0, 0),
+ (139, 0, 0),
+ (143, 0, 0),
+ (146, 0, 0),
+ (155, 0, 0),
+ (162, 0, 0),
+ (170, 0, 0),
+ (180, 0, 0),
+
+ # Node 111
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 176),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 177),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 179),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 209),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 216),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 217),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 227),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 229),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 230),
+ (122, 0, 0),
+ (124, 0, 0),
+ (125, 0, 0),
+ (128, 0, 0),
+ (129, 0, 0),
+ (131, 0, 0),
+ (132, 0, 0),
+
+ # Node 112
+ (1, HUFFMAN_EMIT_SYMBOL, 176),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 176),
+ (1, HUFFMAN_EMIT_SYMBOL, 177),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 177),
+ (1, HUFFMAN_EMIT_SYMBOL, 179),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 179),
+ (1, HUFFMAN_EMIT_SYMBOL, 209),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 209),
+ (1, HUFFMAN_EMIT_SYMBOL, 216),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 216),
+ (1, HUFFMAN_EMIT_SYMBOL, 217),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 217),
+ (1, HUFFMAN_EMIT_SYMBOL, 227),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 227),
+ (1, HUFFMAN_EMIT_SYMBOL, 229),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 229),
+
+ # Node 113
+ (2, HUFFMAN_EMIT_SYMBOL, 176),
+ (9, HUFFMAN_EMIT_SYMBOL, 176),
+ (23, HUFFMAN_EMIT_SYMBOL, 176),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 176),
+ (2, HUFFMAN_EMIT_SYMBOL, 177),
+ (9, HUFFMAN_EMIT_SYMBOL, 177),
+ (23, HUFFMAN_EMIT_SYMBOL, 177),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 177),
+ (2, HUFFMAN_EMIT_SYMBOL, 179),
+ (9, HUFFMAN_EMIT_SYMBOL, 179),
+ (23, HUFFMAN_EMIT_SYMBOL, 179),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 179),
+ (2, HUFFMAN_EMIT_SYMBOL, 209),
+ (9, HUFFMAN_EMIT_SYMBOL, 209),
+ (23, HUFFMAN_EMIT_SYMBOL, 209),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 209),
+
+ # Node 114
+ (3, HUFFMAN_EMIT_SYMBOL, 176),
+ (6, HUFFMAN_EMIT_SYMBOL, 176),
+ (10, HUFFMAN_EMIT_SYMBOL, 176),
+ (15, HUFFMAN_EMIT_SYMBOL, 176),
+ (24, HUFFMAN_EMIT_SYMBOL, 176),
+ (31, HUFFMAN_EMIT_SYMBOL, 176),
+ (41, HUFFMAN_EMIT_SYMBOL, 176),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 176),
+ (3, HUFFMAN_EMIT_SYMBOL, 177),
+ (6, HUFFMAN_EMIT_SYMBOL, 177),
+ (10, HUFFMAN_EMIT_SYMBOL, 177),
+ (15, HUFFMAN_EMIT_SYMBOL, 177),
+ (24, HUFFMAN_EMIT_SYMBOL, 177),
+ (31, HUFFMAN_EMIT_SYMBOL, 177),
+ (41, HUFFMAN_EMIT_SYMBOL, 177),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 177),
+
+ # Node 115
+ (3, HUFFMAN_EMIT_SYMBOL, 179),
+ (6, HUFFMAN_EMIT_SYMBOL, 179),
+ (10, HUFFMAN_EMIT_SYMBOL, 179),
+ (15, HUFFMAN_EMIT_SYMBOL, 179),
+ (24, HUFFMAN_EMIT_SYMBOL, 179),
+ (31, HUFFMAN_EMIT_SYMBOL, 179),
+ (41, HUFFMAN_EMIT_SYMBOL, 179),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 179),
+ (3, HUFFMAN_EMIT_SYMBOL, 209),
+ (6, HUFFMAN_EMIT_SYMBOL, 209),
+ (10, HUFFMAN_EMIT_SYMBOL, 209),
+ (15, HUFFMAN_EMIT_SYMBOL, 209),
+ (24, HUFFMAN_EMIT_SYMBOL, 209),
+ (31, HUFFMAN_EMIT_SYMBOL, 209),
+ (41, HUFFMAN_EMIT_SYMBOL, 209),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 209),
+
+ # Node 116
+ (2, HUFFMAN_EMIT_SYMBOL, 216),
+ (9, HUFFMAN_EMIT_SYMBOL, 216),
+ (23, HUFFMAN_EMIT_SYMBOL, 216),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 216),
+ (2, HUFFMAN_EMIT_SYMBOL, 217),
+ (9, HUFFMAN_EMIT_SYMBOL, 217),
+ (23, HUFFMAN_EMIT_SYMBOL, 217),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 217),
+ (2, HUFFMAN_EMIT_SYMBOL, 227),
+ (9, HUFFMAN_EMIT_SYMBOL, 227),
+ (23, HUFFMAN_EMIT_SYMBOL, 227),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 227),
+ (2, HUFFMAN_EMIT_SYMBOL, 229),
+ (9, HUFFMAN_EMIT_SYMBOL, 229),
+ (23, HUFFMAN_EMIT_SYMBOL, 229),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 229),
+
+ # Node 117
+ (3, HUFFMAN_EMIT_SYMBOL, 216),
+ (6, HUFFMAN_EMIT_SYMBOL, 216),
+ (10, HUFFMAN_EMIT_SYMBOL, 216),
+ (15, HUFFMAN_EMIT_SYMBOL, 216),
+ (24, HUFFMAN_EMIT_SYMBOL, 216),
+ (31, HUFFMAN_EMIT_SYMBOL, 216),
+ (41, HUFFMAN_EMIT_SYMBOL, 216),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 216),
+ (3, HUFFMAN_EMIT_SYMBOL, 217),
+ (6, HUFFMAN_EMIT_SYMBOL, 217),
+ (10, HUFFMAN_EMIT_SYMBOL, 217),
+ (15, HUFFMAN_EMIT_SYMBOL, 217),
+ (24, HUFFMAN_EMIT_SYMBOL, 217),
+ (31, HUFFMAN_EMIT_SYMBOL, 217),
+ (41, HUFFMAN_EMIT_SYMBOL, 217),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 217),
+
+ # Node 118
+ (3, HUFFMAN_EMIT_SYMBOL, 227),
+ (6, HUFFMAN_EMIT_SYMBOL, 227),
+ (10, HUFFMAN_EMIT_SYMBOL, 227),
+ (15, HUFFMAN_EMIT_SYMBOL, 227),
+ (24, HUFFMAN_EMIT_SYMBOL, 227),
+ (31, HUFFMAN_EMIT_SYMBOL, 227),
+ (41, HUFFMAN_EMIT_SYMBOL, 227),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 227),
+ (3, HUFFMAN_EMIT_SYMBOL, 229),
+ (6, HUFFMAN_EMIT_SYMBOL, 229),
+ (10, HUFFMAN_EMIT_SYMBOL, 229),
+ (15, HUFFMAN_EMIT_SYMBOL, 229),
+ (24, HUFFMAN_EMIT_SYMBOL, 229),
+ (31, HUFFMAN_EMIT_SYMBOL, 229),
+ (41, HUFFMAN_EMIT_SYMBOL, 229),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 229),
+
+ # Node 119
+ (1, HUFFMAN_EMIT_SYMBOL, 230),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 230),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 129),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 132),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 133),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 134),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 136),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 146),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 154),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 156),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 160),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 163),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 164),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 169),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 170),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 173),
+
+ # Node 120
+ (2, HUFFMAN_EMIT_SYMBOL, 230),
+ (9, HUFFMAN_EMIT_SYMBOL, 230),
+ (23, HUFFMAN_EMIT_SYMBOL, 230),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 230),
+ (1, HUFFMAN_EMIT_SYMBOL, 129),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 129),
+ (1, HUFFMAN_EMIT_SYMBOL, 132),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 132),
+ (1, HUFFMAN_EMIT_SYMBOL, 133),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 133),
+ (1, HUFFMAN_EMIT_SYMBOL, 134),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 134),
+ (1, HUFFMAN_EMIT_SYMBOL, 136),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 136),
+ (1, HUFFMAN_EMIT_SYMBOL, 146),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 146),
+
+ # Node 121
+ (3, HUFFMAN_EMIT_SYMBOL, 230),
+ (6, HUFFMAN_EMIT_SYMBOL, 230),
+ (10, HUFFMAN_EMIT_SYMBOL, 230),
+ (15, HUFFMAN_EMIT_SYMBOL, 230),
+ (24, HUFFMAN_EMIT_SYMBOL, 230),
+ (31, HUFFMAN_EMIT_SYMBOL, 230),
+ (41, HUFFMAN_EMIT_SYMBOL, 230),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 230),
+ (2, HUFFMAN_EMIT_SYMBOL, 129),
+ (9, HUFFMAN_EMIT_SYMBOL, 129),
+ (23, HUFFMAN_EMIT_SYMBOL, 129),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 129),
+ (2, HUFFMAN_EMIT_SYMBOL, 132),
+ (9, HUFFMAN_EMIT_SYMBOL, 132),
+ (23, HUFFMAN_EMIT_SYMBOL, 132),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 132),
+
+ # Node 122
+ (3, HUFFMAN_EMIT_SYMBOL, 129),
+ (6, HUFFMAN_EMIT_SYMBOL, 129),
+ (10, HUFFMAN_EMIT_SYMBOL, 129),
+ (15, HUFFMAN_EMIT_SYMBOL, 129),
+ (24, HUFFMAN_EMIT_SYMBOL, 129),
+ (31, HUFFMAN_EMIT_SYMBOL, 129),
+ (41, HUFFMAN_EMIT_SYMBOL, 129),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 129),
+ (3, HUFFMAN_EMIT_SYMBOL, 132),
+ (6, HUFFMAN_EMIT_SYMBOL, 132),
+ (10, HUFFMAN_EMIT_SYMBOL, 132),
+ (15, HUFFMAN_EMIT_SYMBOL, 132),
+ (24, HUFFMAN_EMIT_SYMBOL, 132),
+ (31, HUFFMAN_EMIT_SYMBOL, 132),
+ (41, HUFFMAN_EMIT_SYMBOL, 132),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 132),
+
+ # Node 123
+ (2, HUFFMAN_EMIT_SYMBOL, 133),
+ (9, HUFFMAN_EMIT_SYMBOL, 133),
+ (23, HUFFMAN_EMIT_SYMBOL, 133),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 133),
+ (2, HUFFMAN_EMIT_SYMBOL, 134),
+ (9, HUFFMAN_EMIT_SYMBOL, 134),
+ (23, HUFFMAN_EMIT_SYMBOL, 134),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 134),
+ (2, HUFFMAN_EMIT_SYMBOL, 136),
+ (9, HUFFMAN_EMIT_SYMBOL, 136),
+ (23, HUFFMAN_EMIT_SYMBOL, 136),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 136),
+ (2, HUFFMAN_EMIT_SYMBOL, 146),
+ (9, HUFFMAN_EMIT_SYMBOL, 146),
+ (23, HUFFMAN_EMIT_SYMBOL, 146),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 146),
+
+ # Node 124
+ (3, HUFFMAN_EMIT_SYMBOL, 133),
+ (6, HUFFMAN_EMIT_SYMBOL, 133),
+ (10, HUFFMAN_EMIT_SYMBOL, 133),
+ (15, HUFFMAN_EMIT_SYMBOL, 133),
+ (24, HUFFMAN_EMIT_SYMBOL, 133),
+ (31, HUFFMAN_EMIT_SYMBOL, 133),
+ (41, HUFFMAN_EMIT_SYMBOL, 133),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 133),
+ (3, HUFFMAN_EMIT_SYMBOL, 134),
+ (6, HUFFMAN_EMIT_SYMBOL, 134),
+ (10, HUFFMAN_EMIT_SYMBOL, 134),
+ (15, HUFFMAN_EMIT_SYMBOL, 134),
+ (24, HUFFMAN_EMIT_SYMBOL, 134),
+ (31, HUFFMAN_EMIT_SYMBOL, 134),
+ (41, HUFFMAN_EMIT_SYMBOL, 134),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 134),
+
+ # Node 125
+ (3, HUFFMAN_EMIT_SYMBOL, 136),
+ (6, HUFFMAN_EMIT_SYMBOL, 136),
+ (10, HUFFMAN_EMIT_SYMBOL, 136),
+ (15, HUFFMAN_EMIT_SYMBOL, 136),
+ (24, HUFFMAN_EMIT_SYMBOL, 136),
+ (31, HUFFMAN_EMIT_SYMBOL, 136),
+ (41, HUFFMAN_EMIT_SYMBOL, 136),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 136),
+ (3, HUFFMAN_EMIT_SYMBOL, 146),
+ (6, HUFFMAN_EMIT_SYMBOL, 146),
+ (10, HUFFMAN_EMIT_SYMBOL, 146),
+ (15, HUFFMAN_EMIT_SYMBOL, 146),
+ (24, HUFFMAN_EMIT_SYMBOL, 146),
+ (31, HUFFMAN_EMIT_SYMBOL, 146),
+ (41, HUFFMAN_EMIT_SYMBOL, 146),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 146),
+
+ # Node 126
+ (1, HUFFMAN_EMIT_SYMBOL, 154),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 154),
+ (1, HUFFMAN_EMIT_SYMBOL, 156),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 156),
+ (1, HUFFMAN_EMIT_SYMBOL, 160),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 160),
+ (1, HUFFMAN_EMIT_SYMBOL, 163),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 163),
+ (1, HUFFMAN_EMIT_SYMBOL, 164),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 164),
+ (1, HUFFMAN_EMIT_SYMBOL, 169),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 169),
+ (1, HUFFMAN_EMIT_SYMBOL, 170),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 170),
+ (1, HUFFMAN_EMIT_SYMBOL, 173),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 173),
+
+ # Node 127
+ (2, HUFFMAN_EMIT_SYMBOL, 154),
+ (9, HUFFMAN_EMIT_SYMBOL, 154),
+ (23, HUFFMAN_EMIT_SYMBOL, 154),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 154),
+ (2, HUFFMAN_EMIT_SYMBOL, 156),
+ (9, HUFFMAN_EMIT_SYMBOL, 156),
+ (23, HUFFMAN_EMIT_SYMBOL, 156),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 156),
+ (2, HUFFMAN_EMIT_SYMBOL, 160),
+ (9, HUFFMAN_EMIT_SYMBOL, 160),
+ (23, HUFFMAN_EMIT_SYMBOL, 160),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 160),
+ (2, HUFFMAN_EMIT_SYMBOL, 163),
+ (9, HUFFMAN_EMIT_SYMBOL, 163),
+ (23, HUFFMAN_EMIT_SYMBOL, 163),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 163),
+
+ # Node 128
+ (3, HUFFMAN_EMIT_SYMBOL, 154),
+ (6, HUFFMAN_EMIT_SYMBOL, 154),
+ (10, HUFFMAN_EMIT_SYMBOL, 154),
+ (15, HUFFMAN_EMIT_SYMBOL, 154),
+ (24, HUFFMAN_EMIT_SYMBOL, 154),
+ (31, HUFFMAN_EMIT_SYMBOL, 154),
+ (41, HUFFMAN_EMIT_SYMBOL, 154),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 154),
+ (3, HUFFMAN_EMIT_SYMBOL, 156),
+ (6, HUFFMAN_EMIT_SYMBOL, 156),
+ (10, HUFFMAN_EMIT_SYMBOL, 156),
+ (15, HUFFMAN_EMIT_SYMBOL, 156),
+ (24, HUFFMAN_EMIT_SYMBOL, 156),
+ (31, HUFFMAN_EMIT_SYMBOL, 156),
+ (41, HUFFMAN_EMIT_SYMBOL, 156),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 156),
+
+ # Node 129
+ (3, HUFFMAN_EMIT_SYMBOL, 160),
+ (6, HUFFMAN_EMIT_SYMBOL, 160),
+ (10, HUFFMAN_EMIT_SYMBOL, 160),
+ (15, HUFFMAN_EMIT_SYMBOL, 160),
+ (24, HUFFMAN_EMIT_SYMBOL, 160),
+ (31, HUFFMAN_EMIT_SYMBOL, 160),
+ (41, HUFFMAN_EMIT_SYMBOL, 160),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 160),
+ (3, HUFFMAN_EMIT_SYMBOL, 163),
+ (6, HUFFMAN_EMIT_SYMBOL, 163),
+ (10, HUFFMAN_EMIT_SYMBOL, 163),
+ (15, HUFFMAN_EMIT_SYMBOL, 163),
+ (24, HUFFMAN_EMIT_SYMBOL, 163),
+ (31, HUFFMAN_EMIT_SYMBOL, 163),
+ (41, HUFFMAN_EMIT_SYMBOL, 163),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 163),
+
+ # Node 130
+ (2, HUFFMAN_EMIT_SYMBOL, 164),
+ (9, HUFFMAN_EMIT_SYMBOL, 164),
+ (23, HUFFMAN_EMIT_SYMBOL, 164),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 164),
+ (2, HUFFMAN_EMIT_SYMBOL, 169),
+ (9, HUFFMAN_EMIT_SYMBOL, 169),
+ (23, HUFFMAN_EMIT_SYMBOL, 169),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 169),
+ (2, HUFFMAN_EMIT_SYMBOL, 170),
+ (9, HUFFMAN_EMIT_SYMBOL, 170),
+ (23, HUFFMAN_EMIT_SYMBOL, 170),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 170),
+ (2, HUFFMAN_EMIT_SYMBOL, 173),
+ (9, HUFFMAN_EMIT_SYMBOL, 173),
+ (23, HUFFMAN_EMIT_SYMBOL, 173),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 173),
+
+ # Node 131
+ (3, HUFFMAN_EMIT_SYMBOL, 164),
+ (6, HUFFMAN_EMIT_SYMBOL, 164),
+ (10, HUFFMAN_EMIT_SYMBOL, 164),
+ (15, HUFFMAN_EMIT_SYMBOL, 164),
+ (24, HUFFMAN_EMIT_SYMBOL, 164),
+ (31, HUFFMAN_EMIT_SYMBOL, 164),
+ (41, HUFFMAN_EMIT_SYMBOL, 164),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 164),
+ (3, HUFFMAN_EMIT_SYMBOL, 169),
+ (6, HUFFMAN_EMIT_SYMBOL, 169),
+ (10, HUFFMAN_EMIT_SYMBOL, 169),
+ (15, HUFFMAN_EMIT_SYMBOL, 169),
+ (24, HUFFMAN_EMIT_SYMBOL, 169),
+ (31, HUFFMAN_EMIT_SYMBOL, 169),
+ (41, HUFFMAN_EMIT_SYMBOL, 169),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 169),
+
+ # Node 132
+ (3, HUFFMAN_EMIT_SYMBOL, 170),
+ (6, HUFFMAN_EMIT_SYMBOL, 170),
+ (10, HUFFMAN_EMIT_SYMBOL, 170),
+ (15, HUFFMAN_EMIT_SYMBOL, 170),
+ (24, HUFFMAN_EMIT_SYMBOL, 170),
+ (31, HUFFMAN_EMIT_SYMBOL, 170),
+ (41, HUFFMAN_EMIT_SYMBOL, 170),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 170),
+ (3, HUFFMAN_EMIT_SYMBOL, 173),
+ (6, HUFFMAN_EMIT_SYMBOL, 173),
+ (10, HUFFMAN_EMIT_SYMBOL, 173),
+ (15, HUFFMAN_EMIT_SYMBOL, 173),
+ (24, HUFFMAN_EMIT_SYMBOL, 173),
+ (31, HUFFMAN_EMIT_SYMBOL, 173),
+ (41, HUFFMAN_EMIT_SYMBOL, 173),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 173),
+
+ # Node 133
+ (137, 0, 0),
+ (138, 0, 0),
+ (140, 0, 0),
+ (141, 0, 0),
+ (144, 0, 0),
+ (145, 0, 0),
+ (147, 0, 0),
+ (150, 0, 0),
+ (156, 0, 0),
+ (159, 0, 0),
+ (163, 0, 0),
+ (166, 0, 0),
+ (171, 0, 0),
+ (174, 0, 0),
+ (181, 0, 0),
+ (190, 0, 0),
+
+ # Node 134
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 178),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 181),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 185),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 186),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 187),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 189),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 190),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 196),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 198),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 228),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 232),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 233),
+ (148, 0, 0),
+ (149, 0, 0),
+ (151, 0, 0),
+ (152, 0, 0),
+
+ # Node 135
+ (1, HUFFMAN_EMIT_SYMBOL, 178),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 178),
+ (1, HUFFMAN_EMIT_SYMBOL, 181),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 181),
+ (1, HUFFMAN_EMIT_SYMBOL, 185),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 185),
+ (1, HUFFMAN_EMIT_SYMBOL, 186),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 186),
+ (1, HUFFMAN_EMIT_SYMBOL, 187),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 187),
+ (1, HUFFMAN_EMIT_SYMBOL, 189),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 189),
+ (1, HUFFMAN_EMIT_SYMBOL, 190),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 190),
+ (1, HUFFMAN_EMIT_SYMBOL, 196),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 196),
+
+ # Node 136
+ (2, HUFFMAN_EMIT_SYMBOL, 178),
+ (9, HUFFMAN_EMIT_SYMBOL, 178),
+ (23, HUFFMAN_EMIT_SYMBOL, 178),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 178),
+ (2, HUFFMAN_EMIT_SYMBOL, 181),
+ (9, HUFFMAN_EMIT_SYMBOL, 181),
+ (23, HUFFMAN_EMIT_SYMBOL, 181),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 181),
+ (2, HUFFMAN_EMIT_SYMBOL, 185),
+ (9, HUFFMAN_EMIT_SYMBOL, 185),
+ (23, HUFFMAN_EMIT_SYMBOL, 185),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 185),
+ (2, HUFFMAN_EMIT_SYMBOL, 186),
+ (9, HUFFMAN_EMIT_SYMBOL, 186),
+ (23, HUFFMAN_EMIT_SYMBOL, 186),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 186),
+
+ # Node 137
+ (3, HUFFMAN_EMIT_SYMBOL, 178),
+ (6, HUFFMAN_EMIT_SYMBOL, 178),
+ (10, HUFFMAN_EMIT_SYMBOL, 178),
+ (15, HUFFMAN_EMIT_SYMBOL, 178),
+ (24, HUFFMAN_EMIT_SYMBOL, 178),
+ (31, HUFFMAN_EMIT_SYMBOL, 178),
+ (41, HUFFMAN_EMIT_SYMBOL, 178),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 178),
+ (3, HUFFMAN_EMIT_SYMBOL, 181),
+ (6, HUFFMAN_EMIT_SYMBOL, 181),
+ (10, HUFFMAN_EMIT_SYMBOL, 181),
+ (15, HUFFMAN_EMIT_SYMBOL, 181),
+ (24, HUFFMAN_EMIT_SYMBOL, 181),
+ (31, HUFFMAN_EMIT_SYMBOL, 181),
+ (41, HUFFMAN_EMIT_SYMBOL, 181),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 181),
+
+ # Node 138
+ (3, HUFFMAN_EMIT_SYMBOL, 185),
+ (6, HUFFMAN_EMIT_SYMBOL, 185),
+ (10, HUFFMAN_EMIT_SYMBOL, 185),
+ (15, HUFFMAN_EMIT_SYMBOL, 185),
+ (24, HUFFMAN_EMIT_SYMBOL, 185),
+ (31, HUFFMAN_EMIT_SYMBOL, 185),
+ (41, HUFFMAN_EMIT_SYMBOL, 185),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 185),
+ (3, HUFFMAN_EMIT_SYMBOL, 186),
+ (6, HUFFMAN_EMIT_SYMBOL, 186),
+ (10, HUFFMAN_EMIT_SYMBOL, 186),
+ (15, HUFFMAN_EMIT_SYMBOL, 186),
+ (24, HUFFMAN_EMIT_SYMBOL, 186),
+ (31, HUFFMAN_EMIT_SYMBOL, 186),
+ (41, HUFFMAN_EMIT_SYMBOL, 186),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 186),
+
+ # Node 139
+ (2, HUFFMAN_EMIT_SYMBOL, 187),
+ (9, HUFFMAN_EMIT_SYMBOL, 187),
+ (23, HUFFMAN_EMIT_SYMBOL, 187),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 187),
+ (2, HUFFMAN_EMIT_SYMBOL, 189),
+ (9, HUFFMAN_EMIT_SYMBOL, 189),
+ (23, HUFFMAN_EMIT_SYMBOL, 189),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 189),
+ (2, HUFFMAN_EMIT_SYMBOL, 190),
+ (9, HUFFMAN_EMIT_SYMBOL, 190),
+ (23, HUFFMAN_EMIT_SYMBOL, 190),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 190),
+ (2, HUFFMAN_EMIT_SYMBOL, 196),
+ (9, HUFFMAN_EMIT_SYMBOL, 196),
+ (23, HUFFMAN_EMIT_SYMBOL, 196),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 196),
+
+ # Node 140
+ (3, HUFFMAN_EMIT_SYMBOL, 187),
+ (6, HUFFMAN_EMIT_SYMBOL, 187),
+ (10, HUFFMAN_EMIT_SYMBOL, 187),
+ (15, HUFFMAN_EMIT_SYMBOL, 187),
+ (24, HUFFMAN_EMIT_SYMBOL, 187),
+ (31, HUFFMAN_EMIT_SYMBOL, 187),
+ (41, HUFFMAN_EMIT_SYMBOL, 187),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 187),
+ (3, HUFFMAN_EMIT_SYMBOL, 189),
+ (6, HUFFMAN_EMIT_SYMBOL, 189),
+ (10, HUFFMAN_EMIT_SYMBOL, 189),
+ (15, HUFFMAN_EMIT_SYMBOL, 189),
+ (24, HUFFMAN_EMIT_SYMBOL, 189),
+ (31, HUFFMAN_EMIT_SYMBOL, 189),
+ (41, HUFFMAN_EMIT_SYMBOL, 189),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 189),
+
+ # Node 141
+ (3, HUFFMAN_EMIT_SYMBOL, 190),
+ (6, HUFFMAN_EMIT_SYMBOL, 190),
+ (10, HUFFMAN_EMIT_SYMBOL, 190),
+ (15, HUFFMAN_EMIT_SYMBOL, 190),
+ (24, HUFFMAN_EMIT_SYMBOL, 190),
+ (31, HUFFMAN_EMIT_SYMBOL, 190),
+ (41, HUFFMAN_EMIT_SYMBOL, 190),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 190),
+ (3, HUFFMAN_EMIT_SYMBOL, 196),
+ (6, HUFFMAN_EMIT_SYMBOL, 196),
+ (10, HUFFMAN_EMIT_SYMBOL, 196),
+ (15, HUFFMAN_EMIT_SYMBOL, 196),
+ (24, HUFFMAN_EMIT_SYMBOL, 196),
+ (31, HUFFMAN_EMIT_SYMBOL, 196),
+ (41, HUFFMAN_EMIT_SYMBOL, 196),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 196),
+
+ # Node 142
+ (1, HUFFMAN_EMIT_SYMBOL, 198),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 198),
+ (1, HUFFMAN_EMIT_SYMBOL, 228),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 228),
+ (1, HUFFMAN_EMIT_SYMBOL, 232),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 232),
+ (1, HUFFMAN_EMIT_SYMBOL, 233),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 233),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 1),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 135),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 137),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 138),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 139),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 140),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 141),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 143),
+
+ # Node 143
+ (2, HUFFMAN_EMIT_SYMBOL, 198),
+ (9, HUFFMAN_EMIT_SYMBOL, 198),
+ (23, HUFFMAN_EMIT_SYMBOL, 198),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 198),
+ (2, HUFFMAN_EMIT_SYMBOL, 228),
+ (9, HUFFMAN_EMIT_SYMBOL, 228),
+ (23, HUFFMAN_EMIT_SYMBOL, 228),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 228),
+ (2, HUFFMAN_EMIT_SYMBOL, 232),
+ (9, HUFFMAN_EMIT_SYMBOL, 232),
+ (23, HUFFMAN_EMIT_SYMBOL, 232),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 232),
+ (2, HUFFMAN_EMIT_SYMBOL, 233),
+ (9, HUFFMAN_EMIT_SYMBOL, 233),
+ (23, HUFFMAN_EMIT_SYMBOL, 233),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 233),
+
+ # Node 144
+ (3, HUFFMAN_EMIT_SYMBOL, 198),
+ (6, HUFFMAN_EMIT_SYMBOL, 198),
+ (10, HUFFMAN_EMIT_SYMBOL, 198),
+ (15, HUFFMAN_EMIT_SYMBOL, 198),
+ (24, HUFFMAN_EMIT_SYMBOL, 198),
+ (31, HUFFMAN_EMIT_SYMBOL, 198),
+ (41, HUFFMAN_EMIT_SYMBOL, 198),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 198),
+ (3, HUFFMAN_EMIT_SYMBOL, 228),
+ (6, HUFFMAN_EMIT_SYMBOL, 228),
+ (10, HUFFMAN_EMIT_SYMBOL, 228),
+ (15, HUFFMAN_EMIT_SYMBOL, 228),
+ (24, HUFFMAN_EMIT_SYMBOL, 228),
+ (31, HUFFMAN_EMIT_SYMBOL, 228),
+ (41, HUFFMAN_EMIT_SYMBOL, 228),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 228),
+
+ # Node 145
+ (3, HUFFMAN_EMIT_SYMBOL, 232),
+ (6, HUFFMAN_EMIT_SYMBOL, 232),
+ (10, HUFFMAN_EMIT_SYMBOL, 232),
+ (15, HUFFMAN_EMIT_SYMBOL, 232),
+ (24, HUFFMAN_EMIT_SYMBOL, 232),
+ (31, HUFFMAN_EMIT_SYMBOL, 232),
+ (41, HUFFMAN_EMIT_SYMBOL, 232),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 232),
+ (3, HUFFMAN_EMIT_SYMBOL, 233),
+ (6, HUFFMAN_EMIT_SYMBOL, 233),
+ (10, HUFFMAN_EMIT_SYMBOL, 233),
+ (15, HUFFMAN_EMIT_SYMBOL, 233),
+ (24, HUFFMAN_EMIT_SYMBOL, 233),
+ (31, HUFFMAN_EMIT_SYMBOL, 233),
+ (41, HUFFMAN_EMIT_SYMBOL, 233),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 233),
+
+ # Node 146
+ (1, HUFFMAN_EMIT_SYMBOL, 1),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 1),
+ (1, HUFFMAN_EMIT_SYMBOL, 135),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 135),
+ (1, HUFFMAN_EMIT_SYMBOL, 137),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 137),
+ (1, HUFFMAN_EMIT_SYMBOL, 138),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 138),
+ (1, HUFFMAN_EMIT_SYMBOL, 139),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 139),
+ (1, HUFFMAN_EMIT_SYMBOL, 140),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 140),
+ (1, HUFFMAN_EMIT_SYMBOL, 141),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 141),
+ (1, HUFFMAN_EMIT_SYMBOL, 143),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 143),
+
+ # Node 147
+ (2, HUFFMAN_EMIT_SYMBOL, 1),
+ (9, HUFFMAN_EMIT_SYMBOL, 1),
+ (23, HUFFMAN_EMIT_SYMBOL, 1),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 1),
+ (2, HUFFMAN_EMIT_SYMBOL, 135),
+ (9, HUFFMAN_EMIT_SYMBOL, 135),
+ (23, HUFFMAN_EMIT_SYMBOL, 135),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 135),
+ (2, HUFFMAN_EMIT_SYMBOL, 137),
+ (9, HUFFMAN_EMIT_SYMBOL, 137),
+ (23, HUFFMAN_EMIT_SYMBOL, 137),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 137),
+ (2, HUFFMAN_EMIT_SYMBOL, 138),
+ (9, HUFFMAN_EMIT_SYMBOL, 138),
+ (23, HUFFMAN_EMIT_SYMBOL, 138),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 138),
+
+ # Node 148
+ (3, HUFFMAN_EMIT_SYMBOL, 1),
+ (6, HUFFMAN_EMIT_SYMBOL, 1),
+ (10, HUFFMAN_EMIT_SYMBOL, 1),
+ (15, HUFFMAN_EMIT_SYMBOL, 1),
+ (24, HUFFMAN_EMIT_SYMBOL, 1),
+ (31, HUFFMAN_EMIT_SYMBOL, 1),
+ (41, HUFFMAN_EMIT_SYMBOL, 1),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 1),
+ (3, HUFFMAN_EMIT_SYMBOL, 135),
+ (6, HUFFMAN_EMIT_SYMBOL, 135),
+ (10, HUFFMAN_EMIT_SYMBOL, 135),
+ (15, HUFFMAN_EMIT_SYMBOL, 135),
+ (24, HUFFMAN_EMIT_SYMBOL, 135),
+ (31, HUFFMAN_EMIT_SYMBOL, 135),
+ (41, HUFFMAN_EMIT_SYMBOL, 135),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 135),
+
+ # Node 149
+ (3, HUFFMAN_EMIT_SYMBOL, 137),
+ (6, HUFFMAN_EMIT_SYMBOL, 137),
+ (10, HUFFMAN_EMIT_SYMBOL, 137),
+ (15, HUFFMAN_EMIT_SYMBOL, 137),
+ (24, HUFFMAN_EMIT_SYMBOL, 137),
+ (31, HUFFMAN_EMIT_SYMBOL, 137),
+ (41, HUFFMAN_EMIT_SYMBOL, 137),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 137),
+ (3, HUFFMAN_EMIT_SYMBOL, 138),
+ (6, HUFFMAN_EMIT_SYMBOL, 138),
+ (10, HUFFMAN_EMIT_SYMBOL, 138),
+ (15, HUFFMAN_EMIT_SYMBOL, 138),
+ (24, HUFFMAN_EMIT_SYMBOL, 138),
+ (31, HUFFMAN_EMIT_SYMBOL, 138),
+ (41, HUFFMAN_EMIT_SYMBOL, 138),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 138),
+
+ # Node 150
+ (2, HUFFMAN_EMIT_SYMBOL, 139),
+ (9, HUFFMAN_EMIT_SYMBOL, 139),
+ (23, HUFFMAN_EMIT_SYMBOL, 139),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 139),
+ (2, HUFFMAN_EMIT_SYMBOL, 140),
+ (9, HUFFMAN_EMIT_SYMBOL, 140),
+ (23, HUFFMAN_EMIT_SYMBOL, 140),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 140),
+ (2, HUFFMAN_EMIT_SYMBOL, 141),
+ (9, HUFFMAN_EMIT_SYMBOL, 141),
+ (23, HUFFMAN_EMIT_SYMBOL, 141),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 141),
+ (2, HUFFMAN_EMIT_SYMBOL, 143),
+ (9, HUFFMAN_EMIT_SYMBOL, 143),
+ (23, HUFFMAN_EMIT_SYMBOL, 143),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 143),
+
+ # Node 151
+ (3, HUFFMAN_EMIT_SYMBOL, 139),
+ (6, HUFFMAN_EMIT_SYMBOL, 139),
+ (10, HUFFMAN_EMIT_SYMBOL, 139),
+ (15, HUFFMAN_EMIT_SYMBOL, 139),
+ (24, HUFFMAN_EMIT_SYMBOL, 139),
+ (31, HUFFMAN_EMIT_SYMBOL, 139),
+ (41, HUFFMAN_EMIT_SYMBOL, 139),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 139),
+ (3, HUFFMAN_EMIT_SYMBOL, 140),
+ (6, HUFFMAN_EMIT_SYMBOL, 140),
+ (10, HUFFMAN_EMIT_SYMBOL, 140),
+ (15, HUFFMAN_EMIT_SYMBOL, 140),
+ (24, HUFFMAN_EMIT_SYMBOL, 140),
+ (31, HUFFMAN_EMIT_SYMBOL, 140),
+ (41, HUFFMAN_EMIT_SYMBOL, 140),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 140),
+
+ # Node 152
+ (3, HUFFMAN_EMIT_SYMBOL, 141),
+ (6, HUFFMAN_EMIT_SYMBOL, 141),
+ (10, HUFFMAN_EMIT_SYMBOL, 141),
+ (15, HUFFMAN_EMIT_SYMBOL, 141),
+ (24, HUFFMAN_EMIT_SYMBOL, 141),
+ (31, HUFFMAN_EMIT_SYMBOL, 141),
+ (41, HUFFMAN_EMIT_SYMBOL, 141),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 141),
+ (3, HUFFMAN_EMIT_SYMBOL, 143),
+ (6, HUFFMAN_EMIT_SYMBOL, 143),
+ (10, HUFFMAN_EMIT_SYMBOL, 143),
+ (15, HUFFMAN_EMIT_SYMBOL, 143),
+ (24, HUFFMAN_EMIT_SYMBOL, 143),
+ (31, HUFFMAN_EMIT_SYMBOL, 143),
+ (41, HUFFMAN_EMIT_SYMBOL, 143),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 143),
+
+ # Node 153
+ (157, 0, 0),
+ (158, 0, 0),
+ (160, 0, 0),
+ (161, 0, 0),
+ (164, 0, 0),
+ (165, 0, 0),
+ (167, 0, 0),
+ (168, 0, 0),
+ (172, 0, 0),
+ (173, 0, 0),
+ (175, 0, 0),
+ (177, 0, 0),
+ (182, 0, 0),
+ (185, 0, 0),
+ (191, 0, 0),
+ (207, 0, 0),
+
+ # Node 154
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 147),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 149),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 150),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 151),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 152),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 155),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 157),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 158),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 165),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 166),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 168),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 174),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 175),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 180),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 182),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 183),
+
+ # Node 155
+ (1, HUFFMAN_EMIT_SYMBOL, 147),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 147),
+ (1, HUFFMAN_EMIT_SYMBOL, 149),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 149),
+ (1, HUFFMAN_EMIT_SYMBOL, 150),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 150),
+ (1, HUFFMAN_EMIT_SYMBOL, 151),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 151),
+ (1, HUFFMAN_EMIT_SYMBOL, 152),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 152),
+ (1, HUFFMAN_EMIT_SYMBOL, 155),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 155),
+ (1, HUFFMAN_EMIT_SYMBOL, 157),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 157),
+ (1, HUFFMAN_EMIT_SYMBOL, 158),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 158),
+
+ # Node 156
+ (2, HUFFMAN_EMIT_SYMBOL, 147),
+ (9, HUFFMAN_EMIT_SYMBOL, 147),
+ (23, HUFFMAN_EMIT_SYMBOL, 147),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 147),
+ (2, HUFFMAN_EMIT_SYMBOL, 149),
+ (9, HUFFMAN_EMIT_SYMBOL, 149),
+ (23, HUFFMAN_EMIT_SYMBOL, 149),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 149),
+ (2, HUFFMAN_EMIT_SYMBOL, 150),
+ (9, HUFFMAN_EMIT_SYMBOL, 150),
+ (23, HUFFMAN_EMIT_SYMBOL, 150),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 150),
+ (2, HUFFMAN_EMIT_SYMBOL, 151),
+ (9, HUFFMAN_EMIT_SYMBOL, 151),
+ (23, HUFFMAN_EMIT_SYMBOL, 151),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 151),
+
+ # Node 157
+ (3, HUFFMAN_EMIT_SYMBOL, 147),
+ (6, HUFFMAN_EMIT_SYMBOL, 147),
+ (10, HUFFMAN_EMIT_SYMBOL, 147),
+ (15, HUFFMAN_EMIT_SYMBOL, 147),
+ (24, HUFFMAN_EMIT_SYMBOL, 147),
+ (31, HUFFMAN_EMIT_SYMBOL, 147),
+ (41, HUFFMAN_EMIT_SYMBOL, 147),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 147),
+ (3, HUFFMAN_EMIT_SYMBOL, 149),
+ (6, HUFFMAN_EMIT_SYMBOL, 149),
+ (10, HUFFMAN_EMIT_SYMBOL, 149),
+ (15, HUFFMAN_EMIT_SYMBOL, 149),
+ (24, HUFFMAN_EMIT_SYMBOL, 149),
+ (31, HUFFMAN_EMIT_SYMBOL, 149),
+ (41, HUFFMAN_EMIT_SYMBOL, 149),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 149),
+
+ # Node 158
+ (3, HUFFMAN_EMIT_SYMBOL, 150),
+ (6, HUFFMAN_EMIT_SYMBOL, 150),
+ (10, HUFFMAN_EMIT_SYMBOL, 150),
+ (15, HUFFMAN_EMIT_SYMBOL, 150),
+ (24, HUFFMAN_EMIT_SYMBOL, 150),
+ (31, HUFFMAN_EMIT_SYMBOL, 150),
+ (41, HUFFMAN_EMIT_SYMBOL, 150),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 150),
+ (3, HUFFMAN_EMIT_SYMBOL, 151),
+ (6, HUFFMAN_EMIT_SYMBOL, 151),
+ (10, HUFFMAN_EMIT_SYMBOL, 151),
+ (15, HUFFMAN_EMIT_SYMBOL, 151),
+ (24, HUFFMAN_EMIT_SYMBOL, 151),
+ (31, HUFFMAN_EMIT_SYMBOL, 151),
+ (41, HUFFMAN_EMIT_SYMBOL, 151),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 151),
+
+ # Node 159
+ (2, HUFFMAN_EMIT_SYMBOL, 152),
+ (9, HUFFMAN_EMIT_SYMBOL, 152),
+ (23, HUFFMAN_EMIT_SYMBOL, 152),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 152),
+ (2, HUFFMAN_EMIT_SYMBOL, 155),
+ (9, HUFFMAN_EMIT_SYMBOL, 155),
+ (23, HUFFMAN_EMIT_SYMBOL, 155),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 155),
+ (2, HUFFMAN_EMIT_SYMBOL, 157),
+ (9, HUFFMAN_EMIT_SYMBOL, 157),
+ (23, HUFFMAN_EMIT_SYMBOL, 157),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 157),
+ (2, HUFFMAN_EMIT_SYMBOL, 158),
+ (9, HUFFMAN_EMIT_SYMBOL, 158),
+ (23, HUFFMAN_EMIT_SYMBOL, 158),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 158),
+
+ # Node 160
+ (3, HUFFMAN_EMIT_SYMBOL, 152),
+ (6, HUFFMAN_EMIT_SYMBOL, 152),
+ (10, HUFFMAN_EMIT_SYMBOL, 152),
+ (15, HUFFMAN_EMIT_SYMBOL, 152),
+ (24, HUFFMAN_EMIT_SYMBOL, 152),
+ (31, HUFFMAN_EMIT_SYMBOL, 152),
+ (41, HUFFMAN_EMIT_SYMBOL, 152),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 152),
+ (3, HUFFMAN_EMIT_SYMBOL, 155),
+ (6, HUFFMAN_EMIT_SYMBOL, 155),
+ (10, HUFFMAN_EMIT_SYMBOL, 155),
+ (15, HUFFMAN_EMIT_SYMBOL, 155),
+ (24, HUFFMAN_EMIT_SYMBOL, 155),
+ (31, HUFFMAN_EMIT_SYMBOL, 155),
+ (41, HUFFMAN_EMIT_SYMBOL, 155),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 155),
+
+ # Node 161
+ (3, HUFFMAN_EMIT_SYMBOL, 157),
+ (6, HUFFMAN_EMIT_SYMBOL, 157),
+ (10, HUFFMAN_EMIT_SYMBOL, 157),
+ (15, HUFFMAN_EMIT_SYMBOL, 157),
+ (24, HUFFMAN_EMIT_SYMBOL, 157),
+ (31, HUFFMAN_EMIT_SYMBOL, 157),
+ (41, HUFFMAN_EMIT_SYMBOL, 157),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 157),
+ (3, HUFFMAN_EMIT_SYMBOL, 158),
+ (6, HUFFMAN_EMIT_SYMBOL, 158),
+ (10, HUFFMAN_EMIT_SYMBOL, 158),
+ (15, HUFFMAN_EMIT_SYMBOL, 158),
+ (24, HUFFMAN_EMIT_SYMBOL, 158),
+ (31, HUFFMAN_EMIT_SYMBOL, 158),
+ (41, HUFFMAN_EMIT_SYMBOL, 158),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 158),
+
+ # Node 162
+ (1, HUFFMAN_EMIT_SYMBOL, 165),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 165),
+ (1, HUFFMAN_EMIT_SYMBOL, 166),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 166),
+ (1, HUFFMAN_EMIT_SYMBOL, 168),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 168),
+ (1, HUFFMAN_EMIT_SYMBOL, 174),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 174),
+ (1, HUFFMAN_EMIT_SYMBOL, 175),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 175),
+ (1, HUFFMAN_EMIT_SYMBOL, 180),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 180),
+ (1, HUFFMAN_EMIT_SYMBOL, 182),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 182),
+ (1, HUFFMAN_EMIT_SYMBOL, 183),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 183),
+
+ # Node 163
+ (2, HUFFMAN_EMIT_SYMBOL, 165),
+ (9, HUFFMAN_EMIT_SYMBOL, 165),
+ (23, HUFFMAN_EMIT_SYMBOL, 165),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 165),
+ (2, HUFFMAN_EMIT_SYMBOL, 166),
+ (9, HUFFMAN_EMIT_SYMBOL, 166),
+ (23, HUFFMAN_EMIT_SYMBOL, 166),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 166),
+ (2, HUFFMAN_EMIT_SYMBOL, 168),
+ (9, HUFFMAN_EMIT_SYMBOL, 168),
+ (23, HUFFMAN_EMIT_SYMBOL, 168),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 168),
+ (2, HUFFMAN_EMIT_SYMBOL, 174),
+ (9, HUFFMAN_EMIT_SYMBOL, 174),
+ (23, HUFFMAN_EMIT_SYMBOL, 174),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 174),
+
+ # Node 164
+ (3, HUFFMAN_EMIT_SYMBOL, 165),
+ (6, HUFFMAN_EMIT_SYMBOL, 165),
+ (10, HUFFMAN_EMIT_SYMBOL, 165),
+ (15, HUFFMAN_EMIT_SYMBOL, 165),
+ (24, HUFFMAN_EMIT_SYMBOL, 165),
+ (31, HUFFMAN_EMIT_SYMBOL, 165),
+ (41, HUFFMAN_EMIT_SYMBOL, 165),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 165),
+ (3, HUFFMAN_EMIT_SYMBOL, 166),
+ (6, HUFFMAN_EMIT_SYMBOL, 166),
+ (10, HUFFMAN_EMIT_SYMBOL, 166),
+ (15, HUFFMAN_EMIT_SYMBOL, 166),
+ (24, HUFFMAN_EMIT_SYMBOL, 166),
+ (31, HUFFMAN_EMIT_SYMBOL, 166),
+ (41, HUFFMAN_EMIT_SYMBOL, 166),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 166),
+
+ # Node 165
+ (3, HUFFMAN_EMIT_SYMBOL, 168),
+ (6, HUFFMAN_EMIT_SYMBOL, 168),
+ (10, HUFFMAN_EMIT_SYMBOL, 168),
+ (15, HUFFMAN_EMIT_SYMBOL, 168),
+ (24, HUFFMAN_EMIT_SYMBOL, 168),
+ (31, HUFFMAN_EMIT_SYMBOL, 168),
+ (41, HUFFMAN_EMIT_SYMBOL, 168),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 168),
+ (3, HUFFMAN_EMIT_SYMBOL, 174),
+ (6, HUFFMAN_EMIT_SYMBOL, 174),
+ (10, HUFFMAN_EMIT_SYMBOL, 174),
+ (15, HUFFMAN_EMIT_SYMBOL, 174),
+ (24, HUFFMAN_EMIT_SYMBOL, 174),
+ (31, HUFFMAN_EMIT_SYMBOL, 174),
+ (41, HUFFMAN_EMIT_SYMBOL, 174),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 174),
+
+ # Node 166
+ (2, HUFFMAN_EMIT_SYMBOL, 175),
+ (9, HUFFMAN_EMIT_SYMBOL, 175),
+ (23, HUFFMAN_EMIT_SYMBOL, 175),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 175),
+ (2, HUFFMAN_EMIT_SYMBOL, 180),
+ (9, HUFFMAN_EMIT_SYMBOL, 180),
+ (23, HUFFMAN_EMIT_SYMBOL, 180),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 180),
+ (2, HUFFMAN_EMIT_SYMBOL, 182),
+ (9, HUFFMAN_EMIT_SYMBOL, 182),
+ (23, HUFFMAN_EMIT_SYMBOL, 182),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 182),
+ (2, HUFFMAN_EMIT_SYMBOL, 183),
+ (9, HUFFMAN_EMIT_SYMBOL, 183),
+ (23, HUFFMAN_EMIT_SYMBOL, 183),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 183),
+
+ # Node 167
+ (3, HUFFMAN_EMIT_SYMBOL, 175),
+ (6, HUFFMAN_EMIT_SYMBOL, 175),
+ (10, HUFFMAN_EMIT_SYMBOL, 175),
+ (15, HUFFMAN_EMIT_SYMBOL, 175),
+ (24, HUFFMAN_EMIT_SYMBOL, 175),
+ (31, HUFFMAN_EMIT_SYMBOL, 175),
+ (41, HUFFMAN_EMIT_SYMBOL, 175),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 175),
+ (3, HUFFMAN_EMIT_SYMBOL, 180),
+ (6, HUFFMAN_EMIT_SYMBOL, 180),
+ (10, HUFFMAN_EMIT_SYMBOL, 180),
+ (15, HUFFMAN_EMIT_SYMBOL, 180),
+ (24, HUFFMAN_EMIT_SYMBOL, 180),
+ (31, HUFFMAN_EMIT_SYMBOL, 180),
+ (41, HUFFMAN_EMIT_SYMBOL, 180),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 180),
+
+ # Node 168
+ (3, HUFFMAN_EMIT_SYMBOL, 182),
+ (6, HUFFMAN_EMIT_SYMBOL, 182),
+ (10, HUFFMAN_EMIT_SYMBOL, 182),
+ (15, HUFFMAN_EMIT_SYMBOL, 182),
+ (24, HUFFMAN_EMIT_SYMBOL, 182),
+ (31, HUFFMAN_EMIT_SYMBOL, 182),
+ (41, HUFFMAN_EMIT_SYMBOL, 182),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 182),
+ (3, HUFFMAN_EMIT_SYMBOL, 183),
+ (6, HUFFMAN_EMIT_SYMBOL, 183),
+ (10, HUFFMAN_EMIT_SYMBOL, 183),
+ (15, HUFFMAN_EMIT_SYMBOL, 183),
+ (24, HUFFMAN_EMIT_SYMBOL, 183),
+ (31, HUFFMAN_EMIT_SYMBOL, 183),
+ (41, HUFFMAN_EMIT_SYMBOL, 183),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 183),
+
+ # Node 169
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 188),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 191),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 197),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 231),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 239),
+ (176, 0, 0),
+ (178, 0, 0),
+ (179, 0, 0),
+ (183, 0, 0),
+ (184, 0, 0),
+ (186, 0, 0),
+ (187, 0, 0),
+ (192, 0, 0),
+ (199, 0, 0),
+ (208, 0, 0),
+ (223, 0, 0),
+
+ # Node 170
+ (1, HUFFMAN_EMIT_SYMBOL, 188),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 188),
+ (1, HUFFMAN_EMIT_SYMBOL, 191),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 191),
+ (1, HUFFMAN_EMIT_SYMBOL, 197),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 197),
+ (1, HUFFMAN_EMIT_SYMBOL, 231),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 231),
+ (1, HUFFMAN_EMIT_SYMBOL, 239),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 239),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 9),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 142),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 144),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 145),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 148),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 159),
+
+ # Node 171
+ (2, HUFFMAN_EMIT_SYMBOL, 188),
+ (9, HUFFMAN_EMIT_SYMBOL, 188),
+ (23, HUFFMAN_EMIT_SYMBOL, 188),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 188),
+ (2, HUFFMAN_EMIT_SYMBOL, 191),
+ (9, HUFFMAN_EMIT_SYMBOL, 191),
+ (23, HUFFMAN_EMIT_SYMBOL, 191),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 191),
+ (2, HUFFMAN_EMIT_SYMBOL, 197),
+ (9, HUFFMAN_EMIT_SYMBOL, 197),
+ (23, HUFFMAN_EMIT_SYMBOL, 197),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 197),
+ (2, HUFFMAN_EMIT_SYMBOL, 231),
+ (9, HUFFMAN_EMIT_SYMBOL, 231),
+ (23, HUFFMAN_EMIT_SYMBOL, 231),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 231),
+
+ # Node 172
+ (3, HUFFMAN_EMIT_SYMBOL, 188),
+ (6, HUFFMAN_EMIT_SYMBOL, 188),
+ (10, HUFFMAN_EMIT_SYMBOL, 188),
+ (15, HUFFMAN_EMIT_SYMBOL, 188),
+ (24, HUFFMAN_EMIT_SYMBOL, 188),
+ (31, HUFFMAN_EMIT_SYMBOL, 188),
+ (41, HUFFMAN_EMIT_SYMBOL, 188),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 188),
+ (3, HUFFMAN_EMIT_SYMBOL, 191),
+ (6, HUFFMAN_EMIT_SYMBOL, 191),
+ (10, HUFFMAN_EMIT_SYMBOL, 191),
+ (15, HUFFMAN_EMIT_SYMBOL, 191),
+ (24, HUFFMAN_EMIT_SYMBOL, 191),
+ (31, HUFFMAN_EMIT_SYMBOL, 191),
+ (41, HUFFMAN_EMIT_SYMBOL, 191),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 191),
+
+ # Node 173
+ (3, HUFFMAN_EMIT_SYMBOL, 197),
+ (6, HUFFMAN_EMIT_SYMBOL, 197),
+ (10, HUFFMAN_EMIT_SYMBOL, 197),
+ (15, HUFFMAN_EMIT_SYMBOL, 197),
+ (24, HUFFMAN_EMIT_SYMBOL, 197),
+ (31, HUFFMAN_EMIT_SYMBOL, 197),
+ (41, HUFFMAN_EMIT_SYMBOL, 197),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 197),
+ (3, HUFFMAN_EMIT_SYMBOL, 231),
+ (6, HUFFMAN_EMIT_SYMBOL, 231),
+ (10, HUFFMAN_EMIT_SYMBOL, 231),
+ (15, HUFFMAN_EMIT_SYMBOL, 231),
+ (24, HUFFMAN_EMIT_SYMBOL, 231),
+ (31, HUFFMAN_EMIT_SYMBOL, 231),
+ (41, HUFFMAN_EMIT_SYMBOL, 231),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 231),
+
+ # Node 174
+ (2, HUFFMAN_EMIT_SYMBOL, 239),
+ (9, HUFFMAN_EMIT_SYMBOL, 239),
+ (23, HUFFMAN_EMIT_SYMBOL, 239),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 239),
+ (1, HUFFMAN_EMIT_SYMBOL, 9),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 9),
+ (1, HUFFMAN_EMIT_SYMBOL, 142),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 142),
+ (1, HUFFMAN_EMIT_SYMBOL, 144),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 144),
+ (1, HUFFMAN_EMIT_SYMBOL, 145),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 145),
+ (1, HUFFMAN_EMIT_SYMBOL, 148),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 148),
+ (1, HUFFMAN_EMIT_SYMBOL, 159),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 159),
+
+ # Node 175
+ (3, HUFFMAN_EMIT_SYMBOL, 239),
+ (6, HUFFMAN_EMIT_SYMBOL, 239),
+ (10, HUFFMAN_EMIT_SYMBOL, 239),
+ (15, HUFFMAN_EMIT_SYMBOL, 239),
+ (24, HUFFMAN_EMIT_SYMBOL, 239),
+ (31, HUFFMAN_EMIT_SYMBOL, 239),
+ (41, HUFFMAN_EMIT_SYMBOL, 239),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 239),
+ (2, HUFFMAN_EMIT_SYMBOL, 9),
+ (9, HUFFMAN_EMIT_SYMBOL, 9),
+ (23, HUFFMAN_EMIT_SYMBOL, 9),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 9),
+ (2, HUFFMAN_EMIT_SYMBOL, 142),
+ (9, HUFFMAN_EMIT_SYMBOL, 142),
+ (23, HUFFMAN_EMIT_SYMBOL, 142),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 142),
+
+ # Node 176
+ (3, HUFFMAN_EMIT_SYMBOL, 9),
+ (6, HUFFMAN_EMIT_SYMBOL, 9),
+ (10, HUFFMAN_EMIT_SYMBOL, 9),
+ (15, HUFFMAN_EMIT_SYMBOL, 9),
+ (24, HUFFMAN_EMIT_SYMBOL, 9),
+ (31, HUFFMAN_EMIT_SYMBOL, 9),
+ (41, HUFFMAN_EMIT_SYMBOL, 9),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 9),
+ (3, HUFFMAN_EMIT_SYMBOL, 142),
+ (6, HUFFMAN_EMIT_SYMBOL, 142),
+ (10, HUFFMAN_EMIT_SYMBOL, 142),
+ (15, HUFFMAN_EMIT_SYMBOL, 142),
+ (24, HUFFMAN_EMIT_SYMBOL, 142),
+ (31, HUFFMAN_EMIT_SYMBOL, 142),
+ (41, HUFFMAN_EMIT_SYMBOL, 142),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 142),
+
+ # Node 177
+ (2, HUFFMAN_EMIT_SYMBOL, 144),
+ (9, HUFFMAN_EMIT_SYMBOL, 144),
+ (23, HUFFMAN_EMIT_SYMBOL, 144),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 144),
+ (2, HUFFMAN_EMIT_SYMBOL, 145),
+ (9, HUFFMAN_EMIT_SYMBOL, 145),
+ (23, HUFFMAN_EMIT_SYMBOL, 145),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 145),
+ (2, HUFFMAN_EMIT_SYMBOL, 148),
+ (9, HUFFMAN_EMIT_SYMBOL, 148),
+ (23, HUFFMAN_EMIT_SYMBOL, 148),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 148),
+ (2, HUFFMAN_EMIT_SYMBOL, 159),
+ (9, HUFFMAN_EMIT_SYMBOL, 159),
+ (23, HUFFMAN_EMIT_SYMBOL, 159),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 159),
+
+ # Node 178
+ (3, HUFFMAN_EMIT_SYMBOL, 144),
+ (6, HUFFMAN_EMIT_SYMBOL, 144),
+ (10, HUFFMAN_EMIT_SYMBOL, 144),
+ (15, HUFFMAN_EMIT_SYMBOL, 144),
+ (24, HUFFMAN_EMIT_SYMBOL, 144),
+ (31, HUFFMAN_EMIT_SYMBOL, 144),
+ (41, HUFFMAN_EMIT_SYMBOL, 144),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 144),
+ (3, HUFFMAN_EMIT_SYMBOL, 145),
+ (6, HUFFMAN_EMIT_SYMBOL, 145),
+ (10, HUFFMAN_EMIT_SYMBOL, 145),
+ (15, HUFFMAN_EMIT_SYMBOL, 145),
+ (24, HUFFMAN_EMIT_SYMBOL, 145),
+ (31, HUFFMAN_EMIT_SYMBOL, 145),
+ (41, HUFFMAN_EMIT_SYMBOL, 145),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 145),
+
+ # Node 179
+ (3, HUFFMAN_EMIT_SYMBOL, 148),
+ (6, HUFFMAN_EMIT_SYMBOL, 148),
+ (10, HUFFMAN_EMIT_SYMBOL, 148),
+ (15, HUFFMAN_EMIT_SYMBOL, 148),
+ (24, HUFFMAN_EMIT_SYMBOL, 148),
+ (31, HUFFMAN_EMIT_SYMBOL, 148),
+ (41, HUFFMAN_EMIT_SYMBOL, 148),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 148),
+ (3, HUFFMAN_EMIT_SYMBOL, 159),
+ (6, HUFFMAN_EMIT_SYMBOL, 159),
+ (10, HUFFMAN_EMIT_SYMBOL, 159),
+ (15, HUFFMAN_EMIT_SYMBOL, 159),
+ (24, HUFFMAN_EMIT_SYMBOL, 159),
+ (31, HUFFMAN_EMIT_SYMBOL, 159),
+ (41, HUFFMAN_EMIT_SYMBOL, 159),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 159),
+
+ # Node 180
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 171),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 206),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 215),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 225),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 236),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 237),
+ (188, 0, 0),
+ (189, 0, 0),
+ (193, 0, 0),
+ (196, 0, 0),
+ (200, 0, 0),
+ (203, 0, 0),
+ (209, 0, 0),
+ (216, 0, 0),
+ (224, 0, 0),
+ (238, 0, 0),
+
+ # Node 181
+ (1, HUFFMAN_EMIT_SYMBOL, 171),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 171),
+ (1, HUFFMAN_EMIT_SYMBOL, 206),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 206),
+ (1, HUFFMAN_EMIT_SYMBOL, 215),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 215),
+ (1, HUFFMAN_EMIT_SYMBOL, 225),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 225),
+ (1, HUFFMAN_EMIT_SYMBOL, 236),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 236),
+ (1, HUFFMAN_EMIT_SYMBOL, 237),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 237),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 199),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 207),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 234),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 235),
+
+ # Node 182
+ (2, HUFFMAN_EMIT_SYMBOL, 171),
+ (9, HUFFMAN_EMIT_SYMBOL, 171),
+ (23, HUFFMAN_EMIT_SYMBOL, 171),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 171),
+ (2, HUFFMAN_EMIT_SYMBOL, 206),
+ (9, HUFFMAN_EMIT_SYMBOL, 206),
+ (23, HUFFMAN_EMIT_SYMBOL, 206),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 206),
+ (2, HUFFMAN_EMIT_SYMBOL, 215),
+ (9, HUFFMAN_EMIT_SYMBOL, 215),
+ (23, HUFFMAN_EMIT_SYMBOL, 215),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 215),
+ (2, HUFFMAN_EMIT_SYMBOL, 225),
+ (9, HUFFMAN_EMIT_SYMBOL, 225),
+ (23, HUFFMAN_EMIT_SYMBOL, 225),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 225),
+
+ # Node 183
+ (3, HUFFMAN_EMIT_SYMBOL, 171),
+ (6, HUFFMAN_EMIT_SYMBOL, 171),
+ (10, HUFFMAN_EMIT_SYMBOL, 171),
+ (15, HUFFMAN_EMIT_SYMBOL, 171),
+ (24, HUFFMAN_EMIT_SYMBOL, 171),
+ (31, HUFFMAN_EMIT_SYMBOL, 171),
+ (41, HUFFMAN_EMIT_SYMBOL, 171),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 171),
+ (3, HUFFMAN_EMIT_SYMBOL, 206),
+ (6, HUFFMAN_EMIT_SYMBOL, 206),
+ (10, HUFFMAN_EMIT_SYMBOL, 206),
+ (15, HUFFMAN_EMIT_SYMBOL, 206),
+ (24, HUFFMAN_EMIT_SYMBOL, 206),
+ (31, HUFFMAN_EMIT_SYMBOL, 206),
+ (41, HUFFMAN_EMIT_SYMBOL, 206),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 206),
+
+ # Node 184
+ (3, HUFFMAN_EMIT_SYMBOL, 215),
+ (6, HUFFMAN_EMIT_SYMBOL, 215),
+ (10, HUFFMAN_EMIT_SYMBOL, 215),
+ (15, HUFFMAN_EMIT_SYMBOL, 215),
+ (24, HUFFMAN_EMIT_SYMBOL, 215),
+ (31, HUFFMAN_EMIT_SYMBOL, 215),
+ (41, HUFFMAN_EMIT_SYMBOL, 215),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 215),
+ (3, HUFFMAN_EMIT_SYMBOL, 225),
+ (6, HUFFMAN_EMIT_SYMBOL, 225),
+ (10, HUFFMAN_EMIT_SYMBOL, 225),
+ (15, HUFFMAN_EMIT_SYMBOL, 225),
+ (24, HUFFMAN_EMIT_SYMBOL, 225),
+ (31, HUFFMAN_EMIT_SYMBOL, 225),
+ (41, HUFFMAN_EMIT_SYMBOL, 225),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 225),
+
+ # Node 185
+ (2, HUFFMAN_EMIT_SYMBOL, 236),
+ (9, HUFFMAN_EMIT_SYMBOL, 236),
+ (23, HUFFMAN_EMIT_SYMBOL, 236),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 236),
+ (2, HUFFMAN_EMIT_SYMBOL, 237),
+ (9, HUFFMAN_EMIT_SYMBOL, 237),
+ (23, HUFFMAN_EMIT_SYMBOL, 237),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 237),
+ (1, HUFFMAN_EMIT_SYMBOL, 199),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 199),
+ (1, HUFFMAN_EMIT_SYMBOL, 207),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 207),
+ (1, HUFFMAN_EMIT_SYMBOL, 234),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 234),
+ (1, HUFFMAN_EMIT_SYMBOL, 235),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 235),
+
+ # Node 186
+ (3, HUFFMAN_EMIT_SYMBOL, 236),
+ (6, HUFFMAN_EMIT_SYMBOL, 236),
+ (10, HUFFMAN_EMIT_SYMBOL, 236),
+ (15, HUFFMAN_EMIT_SYMBOL, 236),
+ (24, HUFFMAN_EMIT_SYMBOL, 236),
+ (31, HUFFMAN_EMIT_SYMBOL, 236),
+ (41, HUFFMAN_EMIT_SYMBOL, 236),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 236),
+ (3, HUFFMAN_EMIT_SYMBOL, 237),
+ (6, HUFFMAN_EMIT_SYMBOL, 237),
+ (10, HUFFMAN_EMIT_SYMBOL, 237),
+ (15, HUFFMAN_EMIT_SYMBOL, 237),
+ (24, HUFFMAN_EMIT_SYMBOL, 237),
+ (31, HUFFMAN_EMIT_SYMBOL, 237),
+ (41, HUFFMAN_EMIT_SYMBOL, 237),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 237),
+
+ # Node 187
+ (2, HUFFMAN_EMIT_SYMBOL, 199),
+ (9, HUFFMAN_EMIT_SYMBOL, 199),
+ (23, HUFFMAN_EMIT_SYMBOL, 199),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 199),
+ (2, HUFFMAN_EMIT_SYMBOL, 207),
+ (9, HUFFMAN_EMIT_SYMBOL, 207),
+ (23, HUFFMAN_EMIT_SYMBOL, 207),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 207),
+ (2, HUFFMAN_EMIT_SYMBOL, 234),
+ (9, HUFFMAN_EMIT_SYMBOL, 234),
+ (23, HUFFMAN_EMIT_SYMBOL, 234),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 234),
+ (2, HUFFMAN_EMIT_SYMBOL, 235),
+ (9, HUFFMAN_EMIT_SYMBOL, 235),
+ (23, HUFFMAN_EMIT_SYMBOL, 235),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 235),
+
+ # Node 188
+ (3, HUFFMAN_EMIT_SYMBOL, 199),
+ (6, HUFFMAN_EMIT_SYMBOL, 199),
+ (10, HUFFMAN_EMIT_SYMBOL, 199),
+ (15, HUFFMAN_EMIT_SYMBOL, 199),
+ (24, HUFFMAN_EMIT_SYMBOL, 199),
+ (31, HUFFMAN_EMIT_SYMBOL, 199),
+ (41, HUFFMAN_EMIT_SYMBOL, 199),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 199),
+ (3, HUFFMAN_EMIT_SYMBOL, 207),
+ (6, HUFFMAN_EMIT_SYMBOL, 207),
+ (10, HUFFMAN_EMIT_SYMBOL, 207),
+ (15, HUFFMAN_EMIT_SYMBOL, 207),
+ (24, HUFFMAN_EMIT_SYMBOL, 207),
+ (31, HUFFMAN_EMIT_SYMBOL, 207),
+ (41, HUFFMAN_EMIT_SYMBOL, 207),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 207),
+
+ # Node 189
+ (3, HUFFMAN_EMIT_SYMBOL, 234),
+ (6, HUFFMAN_EMIT_SYMBOL, 234),
+ (10, HUFFMAN_EMIT_SYMBOL, 234),
+ (15, HUFFMAN_EMIT_SYMBOL, 234),
+ (24, HUFFMAN_EMIT_SYMBOL, 234),
+ (31, HUFFMAN_EMIT_SYMBOL, 234),
+ (41, HUFFMAN_EMIT_SYMBOL, 234),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 234),
+ (3, HUFFMAN_EMIT_SYMBOL, 235),
+ (6, HUFFMAN_EMIT_SYMBOL, 235),
+ (10, HUFFMAN_EMIT_SYMBOL, 235),
+ (15, HUFFMAN_EMIT_SYMBOL, 235),
+ (24, HUFFMAN_EMIT_SYMBOL, 235),
+ (31, HUFFMAN_EMIT_SYMBOL, 235),
+ (41, HUFFMAN_EMIT_SYMBOL, 235),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 235),
+
+ # Node 190
+ (194, 0, 0),
+ (195, 0, 0),
+ (197, 0, 0),
+ (198, 0, 0),
+ (201, 0, 0),
+ (202, 0, 0),
+ (204, 0, 0),
+ (205, 0, 0),
+ (210, 0, 0),
+ (213, 0, 0),
+ (217, 0, 0),
+ (220, 0, 0),
+ (225, 0, 0),
+ (231, 0, 0),
+ (239, 0, 0),
+ (246, 0, 0),
+
+ # Node 191
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 192),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 193),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 200),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 201),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 202),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 205),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 210),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 213),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 218),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 219),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 238),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 240),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 242),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 243),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 255),
+ (206, 0, 0),
+
+ # Node 192
+ (1, HUFFMAN_EMIT_SYMBOL, 192),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 192),
+ (1, HUFFMAN_EMIT_SYMBOL, 193),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 193),
+ (1, HUFFMAN_EMIT_SYMBOL, 200),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 200),
+ (1, HUFFMAN_EMIT_SYMBOL, 201),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 201),
+ (1, HUFFMAN_EMIT_SYMBOL, 202),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 202),
+ (1, HUFFMAN_EMIT_SYMBOL, 205),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 205),
+ (1, HUFFMAN_EMIT_SYMBOL, 210),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 210),
+ (1, HUFFMAN_EMIT_SYMBOL, 213),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 213),
+
+ # Node 193
+ (2, HUFFMAN_EMIT_SYMBOL, 192),
+ (9, HUFFMAN_EMIT_SYMBOL, 192),
+ (23, HUFFMAN_EMIT_SYMBOL, 192),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 192),
+ (2, HUFFMAN_EMIT_SYMBOL, 193),
+ (9, HUFFMAN_EMIT_SYMBOL, 193),
+ (23, HUFFMAN_EMIT_SYMBOL, 193),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 193),
+ (2, HUFFMAN_EMIT_SYMBOL, 200),
+ (9, HUFFMAN_EMIT_SYMBOL, 200),
+ (23, HUFFMAN_EMIT_SYMBOL, 200),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 200),
+ (2, HUFFMAN_EMIT_SYMBOL, 201),
+ (9, HUFFMAN_EMIT_SYMBOL, 201),
+ (23, HUFFMAN_EMIT_SYMBOL, 201),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 201),
+
+ # Node 194
+ (3, HUFFMAN_EMIT_SYMBOL, 192),
+ (6, HUFFMAN_EMIT_SYMBOL, 192),
+ (10, HUFFMAN_EMIT_SYMBOL, 192),
+ (15, HUFFMAN_EMIT_SYMBOL, 192),
+ (24, HUFFMAN_EMIT_SYMBOL, 192),
+ (31, HUFFMAN_EMIT_SYMBOL, 192),
+ (41, HUFFMAN_EMIT_SYMBOL, 192),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 192),
+ (3, HUFFMAN_EMIT_SYMBOL, 193),
+ (6, HUFFMAN_EMIT_SYMBOL, 193),
+ (10, HUFFMAN_EMIT_SYMBOL, 193),
+ (15, HUFFMAN_EMIT_SYMBOL, 193),
+ (24, HUFFMAN_EMIT_SYMBOL, 193),
+ (31, HUFFMAN_EMIT_SYMBOL, 193),
+ (41, HUFFMAN_EMIT_SYMBOL, 193),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 193),
+
+ # Node 195
+ (3, HUFFMAN_EMIT_SYMBOL, 200),
+ (6, HUFFMAN_EMIT_SYMBOL, 200),
+ (10, HUFFMAN_EMIT_SYMBOL, 200),
+ (15, HUFFMAN_EMIT_SYMBOL, 200),
+ (24, HUFFMAN_EMIT_SYMBOL, 200),
+ (31, HUFFMAN_EMIT_SYMBOL, 200),
+ (41, HUFFMAN_EMIT_SYMBOL, 200),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 200),
+ (3, HUFFMAN_EMIT_SYMBOL, 201),
+ (6, HUFFMAN_EMIT_SYMBOL, 201),
+ (10, HUFFMAN_EMIT_SYMBOL, 201),
+ (15, HUFFMAN_EMIT_SYMBOL, 201),
+ (24, HUFFMAN_EMIT_SYMBOL, 201),
+ (31, HUFFMAN_EMIT_SYMBOL, 201),
+ (41, HUFFMAN_EMIT_SYMBOL, 201),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 201),
+
+ # Node 196
+ (2, HUFFMAN_EMIT_SYMBOL, 202),
+ (9, HUFFMAN_EMIT_SYMBOL, 202),
+ (23, HUFFMAN_EMIT_SYMBOL, 202),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 202),
+ (2, HUFFMAN_EMIT_SYMBOL, 205),
+ (9, HUFFMAN_EMIT_SYMBOL, 205),
+ (23, HUFFMAN_EMIT_SYMBOL, 205),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 205),
+ (2, HUFFMAN_EMIT_SYMBOL, 210),
+ (9, HUFFMAN_EMIT_SYMBOL, 210),
+ (23, HUFFMAN_EMIT_SYMBOL, 210),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 210),
+ (2, HUFFMAN_EMIT_SYMBOL, 213),
+ (9, HUFFMAN_EMIT_SYMBOL, 213),
+ (23, HUFFMAN_EMIT_SYMBOL, 213),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 213),
+
+ # Node 197
+ (3, HUFFMAN_EMIT_SYMBOL, 202),
+ (6, HUFFMAN_EMIT_SYMBOL, 202),
+ (10, HUFFMAN_EMIT_SYMBOL, 202),
+ (15, HUFFMAN_EMIT_SYMBOL, 202),
+ (24, HUFFMAN_EMIT_SYMBOL, 202),
+ (31, HUFFMAN_EMIT_SYMBOL, 202),
+ (41, HUFFMAN_EMIT_SYMBOL, 202),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 202),
+ (3, HUFFMAN_EMIT_SYMBOL, 205),
+ (6, HUFFMAN_EMIT_SYMBOL, 205),
+ (10, HUFFMAN_EMIT_SYMBOL, 205),
+ (15, HUFFMAN_EMIT_SYMBOL, 205),
+ (24, HUFFMAN_EMIT_SYMBOL, 205),
+ (31, HUFFMAN_EMIT_SYMBOL, 205),
+ (41, HUFFMAN_EMIT_SYMBOL, 205),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 205),
+
+ # Node 198
+ (3, HUFFMAN_EMIT_SYMBOL, 210),
+ (6, HUFFMAN_EMIT_SYMBOL, 210),
+ (10, HUFFMAN_EMIT_SYMBOL, 210),
+ (15, HUFFMAN_EMIT_SYMBOL, 210),
+ (24, HUFFMAN_EMIT_SYMBOL, 210),
+ (31, HUFFMAN_EMIT_SYMBOL, 210),
+ (41, HUFFMAN_EMIT_SYMBOL, 210),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 210),
+ (3, HUFFMAN_EMIT_SYMBOL, 213),
+ (6, HUFFMAN_EMIT_SYMBOL, 213),
+ (10, HUFFMAN_EMIT_SYMBOL, 213),
+ (15, HUFFMAN_EMIT_SYMBOL, 213),
+ (24, HUFFMAN_EMIT_SYMBOL, 213),
+ (31, HUFFMAN_EMIT_SYMBOL, 213),
+ (41, HUFFMAN_EMIT_SYMBOL, 213),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 213),
+
+ # Node 199
+ (1, HUFFMAN_EMIT_SYMBOL, 218),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 218),
+ (1, HUFFMAN_EMIT_SYMBOL, 219),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 219),
+ (1, HUFFMAN_EMIT_SYMBOL, 238),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 238),
+ (1, HUFFMAN_EMIT_SYMBOL, 240),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 240),
+ (1, HUFFMAN_EMIT_SYMBOL, 242),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 242),
+ (1, HUFFMAN_EMIT_SYMBOL, 243),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 243),
+ (1, HUFFMAN_EMIT_SYMBOL, 255),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 255),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 203),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 204),
+
+ # Node 200
+ (2, HUFFMAN_EMIT_SYMBOL, 218),
+ (9, HUFFMAN_EMIT_SYMBOL, 218),
+ (23, HUFFMAN_EMIT_SYMBOL, 218),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 218),
+ (2, HUFFMAN_EMIT_SYMBOL, 219),
+ (9, HUFFMAN_EMIT_SYMBOL, 219),
+ (23, HUFFMAN_EMIT_SYMBOL, 219),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 219),
+ (2, HUFFMAN_EMIT_SYMBOL, 238),
+ (9, HUFFMAN_EMIT_SYMBOL, 238),
+ (23, HUFFMAN_EMIT_SYMBOL, 238),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 238),
+ (2, HUFFMAN_EMIT_SYMBOL, 240),
+ (9, HUFFMAN_EMIT_SYMBOL, 240),
+ (23, HUFFMAN_EMIT_SYMBOL, 240),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 240),
+
+ # Node 201
+ (3, HUFFMAN_EMIT_SYMBOL, 218),
+ (6, HUFFMAN_EMIT_SYMBOL, 218),
+ (10, HUFFMAN_EMIT_SYMBOL, 218),
+ (15, HUFFMAN_EMIT_SYMBOL, 218),
+ (24, HUFFMAN_EMIT_SYMBOL, 218),
+ (31, HUFFMAN_EMIT_SYMBOL, 218),
+ (41, HUFFMAN_EMIT_SYMBOL, 218),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 218),
+ (3, HUFFMAN_EMIT_SYMBOL, 219),
+ (6, HUFFMAN_EMIT_SYMBOL, 219),
+ (10, HUFFMAN_EMIT_SYMBOL, 219),
+ (15, HUFFMAN_EMIT_SYMBOL, 219),
+ (24, HUFFMAN_EMIT_SYMBOL, 219),
+ (31, HUFFMAN_EMIT_SYMBOL, 219),
+ (41, HUFFMAN_EMIT_SYMBOL, 219),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 219),
+
+ # Node 202
+ (3, HUFFMAN_EMIT_SYMBOL, 238),
+ (6, HUFFMAN_EMIT_SYMBOL, 238),
+ (10, HUFFMAN_EMIT_SYMBOL, 238),
+ (15, HUFFMAN_EMIT_SYMBOL, 238),
+ (24, HUFFMAN_EMIT_SYMBOL, 238),
+ (31, HUFFMAN_EMIT_SYMBOL, 238),
+ (41, HUFFMAN_EMIT_SYMBOL, 238),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 238),
+ (3, HUFFMAN_EMIT_SYMBOL, 240),
+ (6, HUFFMAN_EMIT_SYMBOL, 240),
+ (10, HUFFMAN_EMIT_SYMBOL, 240),
+ (15, HUFFMAN_EMIT_SYMBOL, 240),
+ (24, HUFFMAN_EMIT_SYMBOL, 240),
+ (31, HUFFMAN_EMIT_SYMBOL, 240),
+ (41, HUFFMAN_EMIT_SYMBOL, 240),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 240),
+
+ # Node 203
+ (2, HUFFMAN_EMIT_SYMBOL, 242),
+ (9, HUFFMAN_EMIT_SYMBOL, 242),
+ (23, HUFFMAN_EMIT_SYMBOL, 242),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 242),
+ (2, HUFFMAN_EMIT_SYMBOL, 243),
+ (9, HUFFMAN_EMIT_SYMBOL, 243),
+ (23, HUFFMAN_EMIT_SYMBOL, 243),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 243),
+ (2, HUFFMAN_EMIT_SYMBOL, 255),
+ (9, HUFFMAN_EMIT_SYMBOL, 255),
+ (23, HUFFMAN_EMIT_SYMBOL, 255),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 255),
+ (1, HUFFMAN_EMIT_SYMBOL, 203),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 203),
+ (1, HUFFMAN_EMIT_SYMBOL, 204),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 204),
+
+ # Node 204
+ (3, HUFFMAN_EMIT_SYMBOL, 242),
+ (6, HUFFMAN_EMIT_SYMBOL, 242),
+ (10, HUFFMAN_EMIT_SYMBOL, 242),
+ (15, HUFFMAN_EMIT_SYMBOL, 242),
+ (24, HUFFMAN_EMIT_SYMBOL, 242),
+ (31, HUFFMAN_EMIT_SYMBOL, 242),
+ (41, HUFFMAN_EMIT_SYMBOL, 242),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 242),
+ (3, HUFFMAN_EMIT_SYMBOL, 243),
+ (6, HUFFMAN_EMIT_SYMBOL, 243),
+ (10, HUFFMAN_EMIT_SYMBOL, 243),
+ (15, HUFFMAN_EMIT_SYMBOL, 243),
+ (24, HUFFMAN_EMIT_SYMBOL, 243),
+ (31, HUFFMAN_EMIT_SYMBOL, 243),
+ (41, HUFFMAN_EMIT_SYMBOL, 243),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 243),
+
+ # Node 205
+ (3, HUFFMAN_EMIT_SYMBOL, 255),
+ (6, HUFFMAN_EMIT_SYMBOL, 255),
+ (10, HUFFMAN_EMIT_SYMBOL, 255),
+ (15, HUFFMAN_EMIT_SYMBOL, 255),
+ (24, HUFFMAN_EMIT_SYMBOL, 255),
+ (31, HUFFMAN_EMIT_SYMBOL, 255),
+ (41, HUFFMAN_EMIT_SYMBOL, 255),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 255),
+ (2, HUFFMAN_EMIT_SYMBOL, 203),
+ (9, HUFFMAN_EMIT_SYMBOL, 203),
+ (23, HUFFMAN_EMIT_SYMBOL, 203),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 203),
+ (2, HUFFMAN_EMIT_SYMBOL, 204),
+ (9, HUFFMAN_EMIT_SYMBOL, 204),
+ (23, HUFFMAN_EMIT_SYMBOL, 204),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 204),
+
+ # Node 206
+ (3, HUFFMAN_EMIT_SYMBOL, 203),
+ (6, HUFFMAN_EMIT_SYMBOL, 203),
+ (10, HUFFMAN_EMIT_SYMBOL, 203),
+ (15, HUFFMAN_EMIT_SYMBOL, 203),
+ (24, HUFFMAN_EMIT_SYMBOL, 203),
+ (31, HUFFMAN_EMIT_SYMBOL, 203),
+ (41, HUFFMAN_EMIT_SYMBOL, 203),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 203),
+ (3, HUFFMAN_EMIT_SYMBOL, 204),
+ (6, HUFFMAN_EMIT_SYMBOL, 204),
+ (10, HUFFMAN_EMIT_SYMBOL, 204),
+ (15, HUFFMAN_EMIT_SYMBOL, 204),
+ (24, HUFFMAN_EMIT_SYMBOL, 204),
+ (31, HUFFMAN_EMIT_SYMBOL, 204),
+ (41, HUFFMAN_EMIT_SYMBOL, 204),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 204),
+
+ # Node 207
+ (211, 0, 0),
+ (212, 0, 0),
+ (214, 0, 0),
+ (215, 0, 0),
+ (218, 0, 0),
+ (219, 0, 0),
+ (221, 0, 0),
+ (222, 0, 0),
+ (226, 0, 0),
+ (228, 0, 0),
+ (232, 0, 0),
+ (235, 0, 0),
+ (240, 0, 0),
+ (243, 0, 0),
+ (247, 0, 0),
+ (250, 0, 0),
+
+ # Node 208
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 211),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 212),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 214),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 221),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 222),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 223),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 241),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 244),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 245),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 246),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 247),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 248),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 250),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 251),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 252),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 253),
+
+ # Node 209
+ (1, HUFFMAN_EMIT_SYMBOL, 211),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 211),
+ (1, HUFFMAN_EMIT_SYMBOL, 212),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 212),
+ (1, HUFFMAN_EMIT_SYMBOL, 214),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 214),
+ (1, HUFFMAN_EMIT_SYMBOL, 221),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 221),
+ (1, HUFFMAN_EMIT_SYMBOL, 222),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 222),
+ (1, HUFFMAN_EMIT_SYMBOL, 223),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 223),
+ (1, HUFFMAN_EMIT_SYMBOL, 241),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 241),
+ (1, HUFFMAN_EMIT_SYMBOL, 244),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 244),
+
+ # Node 210
+ (2, HUFFMAN_EMIT_SYMBOL, 211),
+ (9, HUFFMAN_EMIT_SYMBOL, 211),
+ (23, HUFFMAN_EMIT_SYMBOL, 211),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 211),
+ (2, HUFFMAN_EMIT_SYMBOL, 212),
+ (9, HUFFMAN_EMIT_SYMBOL, 212),
+ (23, HUFFMAN_EMIT_SYMBOL, 212),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 212),
+ (2, HUFFMAN_EMIT_SYMBOL, 214),
+ (9, HUFFMAN_EMIT_SYMBOL, 214),
+ (23, HUFFMAN_EMIT_SYMBOL, 214),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 214),
+ (2, HUFFMAN_EMIT_SYMBOL, 221),
+ (9, HUFFMAN_EMIT_SYMBOL, 221),
+ (23, HUFFMAN_EMIT_SYMBOL, 221),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 221),
+
+ # Node 211
+ (3, HUFFMAN_EMIT_SYMBOL, 211),
+ (6, HUFFMAN_EMIT_SYMBOL, 211),
+ (10, HUFFMAN_EMIT_SYMBOL, 211),
+ (15, HUFFMAN_EMIT_SYMBOL, 211),
+ (24, HUFFMAN_EMIT_SYMBOL, 211),
+ (31, HUFFMAN_EMIT_SYMBOL, 211),
+ (41, HUFFMAN_EMIT_SYMBOL, 211),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 211),
+ (3, HUFFMAN_EMIT_SYMBOL, 212),
+ (6, HUFFMAN_EMIT_SYMBOL, 212),
+ (10, HUFFMAN_EMIT_SYMBOL, 212),
+ (15, HUFFMAN_EMIT_SYMBOL, 212),
+ (24, HUFFMAN_EMIT_SYMBOL, 212),
+ (31, HUFFMAN_EMIT_SYMBOL, 212),
+ (41, HUFFMAN_EMIT_SYMBOL, 212),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 212),
+
+ # Node 212
+ (3, HUFFMAN_EMIT_SYMBOL, 214),
+ (6, HUFFMAN_EMIT_SYMBOL, 214),
+ (10, HUFFMAN_EMIT_SYMBOL, 214),
+ (15, HUFFMAN_EMIT_SYMBOL, 214),
+ (24, HUFFMAN_EMIT_SYMBOL, 214),
+ (31, HUFFMAN_EMIT_SYMBOL, 214),
+ (41, HUFFMAN_EMIT_SYMBOL, 214),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 214),
+ (3, HUFFMAN_EMIT_SYMBOL, 221),
+ (6, HUFFMAN_EMIT_SYMBOL, 221),
+ (10, HUFFMAN_EMIT_SYMBOL, 221),
+ (15, HUFFMAN_EMIT_SYMBOL, 221),
+ (24, HUFFMAN_EMIT_SYMBOL, 221),
+ (31, HUFFMAN_EMIT_SYMBOL, 221),
+ (41, HUFFMAN_EMIT_SYMBOL, 221),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 221),
+
+ # Node 213
+ (2, HUFFMAN_EMIT_SYMBOL, 222),
+ (9, HUFFMAN_EMIT_SYMBOL, 222),
+ (23, HUFFMAN_EMIT_SYMBOL, 222),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 222),
+ (2, HUFFMAN_EMIT_SYMBOL, 223),
+ (9, HUFFMAN_EMIT_SYMBOL, 223),
+ (23, HUFFMAN_EMIT_SYMBOL, 223),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 223),
+ (2, HUFFMAN_EMIT_SYMBOL, 241),
+ (9, HUFFMAN_EMIT_SYMBOL, 241),
+ (23, HUFFMAN_EMIT_SYMBOL, 241),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 241),
+ (2, HUFFMAN_EMIT_SYMBOL, 244),
+ (9, HUFFMAN_EMIT_SYMBOL, 244),
+ (23, HUFFMAN_EMIT_SYMBOL, 244),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 244),
+
+ # Node 214
+ (3, HUFFMAN_EMIT_SYMBOL, 222),
+ (6, HUFFMAN_EMIT_SYMBOL, 222),
+ (10, HUFFMAN_EMIT_SYMBOL, 222),
+ (15, HUFFMAN_EMIT_SYMBOL, 222),
+ (24, HUFFMAN_EMIT_SYMBOL, 222),
+ (31, HUFFMAN_EMIT_SYMBOL, 222),
+ (41, HUFFMAN_EMIT_SYMBOL, 222),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 222),
+ (3, HUFFMAN_EMIT_SYMBOL, 223),
+ (6, HUFFMAN_EMIT_SYMBOL, 223),
+ (10, HUFFMAN_EMIT_SYMBOL, 223),
+ (15, HUFFMAN_EMIT_SYMBOL, 223),
+ (24, HUFFMAN_EMIT_SYMBOL, 223),
+ (31, HUFFMAN_EMIT_SYMBOL, 223),
+ (41, HUFFMAN_EMIT_SYMBOL, 223),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 223),
+
+ # Node 215
+ (3, HUFFMAN_EMIT_SYMBOL, 241),
+ (6, HUFFMAN_EMIT_SYMBOL, 241),
+ (10, HUFFMAN_EMIT_SYMBOL, 241),
+ (15, HUFFMAN_EMIT_SYMBOL, 241),
+ (24, HUFFMAN_EMIT_SYMBOL, 241),
+ (31, HUFFMAN_EMIT_SYMBOL, 241),
+ (41, HUFFMAN_EMIT_SYMBOL, 241),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 241),
+ (3, HUFFMAN_EMIT_SYMBOL, 244),
+ (6, HUFFMAN_EMIT_SYMBOL, 244),
+ (10, HUFFMAN_EMIT_SYMBOL, 244),
+ (15, HUFFMAN_EMIT_SYMBOL, 244),
+ (24, HUFFMAN_EMIT_SYMBOL, 244),
+ (31, HUFFMAN_EMIT_SYMBOL, 244),
+ (41, HUFFMAN_EMIT_SYMBOL, 244),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 244),
+
+ # Node 216
+ (1, HUFFMAN_EMIT_SYMBOL, 245),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 245),
+ (1, HUFFMAN_EMIT_SYMBOL, 246),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 246),
+ (1, HUFFMAN_EMIT_SYMBOL, 247),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 247),
+ (1, HUFFMAN_EMIT_SYMBOL, 248),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 248),
+ (1, HUFFMAN_EMIT_SYMBOL, 250),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 250),
+ (1, HUFFMAN_EMIT_SYMBOL, 251),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 251),
+ (1, HUFFMAN_EMIT_SYMBOL, 252),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 252),
+ (1, HUFFMAN_EMIT_SYMBOL, 253),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 253),
+
+ # Node 217
+ (2, HUFFMAN_EMIT_SYMBOL, 245),
+ (9, HUFFMAN_EMIT_SYMBOL, 245),
+ (23, HUFFMAN_EMIT_SYMBOL, 245),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 245),
+ (2, HUFFMAN_EMIT_SYMBOL, 246),
+ (9, HUFFMAN_EMIT_SYMBOL, 246),
+ (23, HUFFMAN_EMIT_SYMBOL, 246),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 246),
+ (2, HUFFMAN_EMIT_SYMBOL, 247),
+ (9, HUFFMAN_EMIT_SYMBOL, 247),
+ (23, HUFFMAN_EMIT_SYMBOL, 247),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 247),
+ (2, HUFFMAN_EMIT_SYMBOL, 248),
+ (9, HUFFMAN_EMIT_SYMBOL, 248),
+ (23, HUFFMAN_EMIT_SYMBOL, 248),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 248),
+
+ # Node 218
+ (3, HUFFMAN_EMIT_SYMBOL, 245),
+ (6, HUFFMAN_EMIT_SYMBOL, 245),
+ (10, HUFFMAN_EMIT_SYMBOL, 245),
+ (15, HUFFMAN_EMIT_SYMBOL, 245),
+ (24, HUFFMAN_EMIT_SYMBOL, 245),
+ (31, HUFFMAN_EMIT_SYMBOL, 245),
+ (41, HUFFMAN_EMIT_SYMBOL, 245),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 245),
+ (3, HUFFMAN_EMIT_SYMBOL, 246),
+ (6, HUFFMAN_EMIT_SYMBOL, 246),
+ (10, HUFFMAN_EMIT_SYMBOL, 246),
+ (15, HUFFMAN_EMIT_SYMBOL, 246),
+ (24, HUFFMAN_EMIT_SYMBOL, 246),
+ (31, HUFFMAN_EMIT_SYMBOL, 246),
+ (41, HUFFMAN_EMIT_SYMBOL, 246),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 246),
+
+ # Node 219
+ (3, HUFFMAN_EMIT_SYMBOL, 247),
+ (6, HUFFMAN_EMIT_SYMBOL, 247),
+ (10, HUFFMAN_EMIT_SYMBOL, 247),
+ (15, HUFFMAN_EMIT_SYMBOL, 247),
+ (24, HUFFMAN_EMIT_SYMBOL, 247),
+ (31, HUFFMAN_EMIT_SYMBOL, 247),
+ (41, HUFFMAN_EMIT_SYMBOL, 247),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 247),
+ (3, HUFFMAN_EMIT_SYMBOL, 248),
+ (6, HUFFMAN_EMIT_SYMBOL, 248),
+ (10, HUFFMAN_EMIT_SYMBOL, 248),
+ (15, HUFFMAN_EMIT_SYMBOL, 248),
+ (24, HUFFMAN_EMIT_SYMBOL, 248),
+ (31, HUFFMAN_EMIT_SYMBOL, 248),
+ (41, HUFFMAN_EMIT_SYMBOL, 248),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 248),
+
+ # Node 220
+ (2, HUFFMAN_EMIT_SYMBOL, 250),
+ (9, HUFFMAN_EMIT_SYMBOL, 250),
+ (23, HUFFMAN_EMIT_SYMBOL, 250),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 250),
+ (2, HUFFMAN_EMIT_SYMBOL, 251),
+ (9, HUFFMAN_EMIT_SYMBOL, 251),
+ (23, HUFFMAN_EMIT_SYMBOL, 251),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 251),
+ (2, HUFFMAN_EMIT_SYMBOL, 252),
+ (9, HUFFMAN_EMIT_SYMBOL, 252),
+ (23, HUFFMAN_EMIT_SYMBOL, 252),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 252),
+ (2, HUFFMAN_EMIT_SYMBOL, 253),
+ (9, HUFFMAN_EMIT_SYMBOL, 253),
+ (23, HUFFMAN_EMIT_SYMBOL, 253),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 253),
+
+ # Node 221
+ (3, HUFFMAN_EMIT_SYMBOL, 250),
+ (6, HUFFMAN_EMIT_SYMBOL, 250),
+ (10, HUFFMAN_EMIT_SYMBOL, 250),
+ (15, HUFFMAN_EMIT_SYMBOL, 250),
+ (24, HUFFMAN_EMIT_SYMBOL, 250),
+ (31, HUFFMAN_EMIT_SYMBOL, 250),
+ (41, HUFFMAN_EMIT_SYMBOL, 250),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 250),
+ (3, HUFFMAN_EMIT_SYMBOL, 251),
+ (6, HUFFMAN_EMIT_SYMBOL, 251),
+ (10, HUFFMAN_EMIT_SYMBOL, 251),
+ (15, HUFFMAN_EMIT_SYMBOL, 251),
+ (24, HUFFMAN_EMIT_SYMBOL, 251),
+ (31, HUFFMAN_EMIT_SYMBOL, 251),
+ (41, HUFFMAN_EMIT_SYMBOL, 251),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 251),
+
+ # Node 222
+ (3, HUFFMAN_EMIT_SYMBOL, 252),
+ (6, HUFFMAN_EMIT_SYMBOL, 252),
+ (10, HUFFMAN_EMIT_SYMBOL, 252),
+ (15, HUFFMAN_EMIT_SYMBOL, 252),
+ (24, HUFFMAN_EMIT_SYMBOL, 252),
+ (31, HUFFMAN_EMIT_SYMBOL, 252),
+ (41, HUFFMAN_EMIT_SYMBOL, 252),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 252),
+ (3, HUFFMAN_EMIT_SYMBOL, 253),
+ (6, HUFFMAN_EMIT_SYMBOL, 253),
+ (10, HUFFMAN_EMIT_SYMBOL, 253),
+ (15, HUFFMAN_EMIT_SYMBOL, 253),
+ (24, HUFFMAN_EMIT_SYMBOL, 253),
+ (31, HUFFMAN_EMIT_SYMBOL, 253),
+ (41, HUFFMAN_EMIT_SYMBOL, 253),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 253),
+
+ # Node 223
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 254),
+ (227, 0, 0),
+ (229, 0, 0),
+ (230, 0, 0),
+ (233, 0, 0),
+ (234, 0, 0),
+ (236, 0, 0),
+ (237, 0, 0),
+ (241, 0, 0),
+ (242, 0, 0),
+ (244, 0, 0),
+ (245, 0, 0),
+ (248, 0, 0),
+ (249, 0, 0),
+ (251, 0, 0),
+ (252, 0, 0),
+
+ # Node 224
+ (1, HUFFMAN_EMIT_SYMBOL, 254),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 254),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 2),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 3),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 4),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 5),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 6),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 7),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 8),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 11),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 12),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 14),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 15),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 16),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 17),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 18),
+
+ # Node 225
+ (2, HUFFMAN_EMIT_SYMBOL, 254),
+ (9, HUFFMAN_EMIT_SYMBOL, 254),
+ (23, HUFFMAN_EMIT_SYMBOL, 254),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 254),
+ (1, HUFFMAN_EMIT_SYMBOL, 2),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 2),
+ (1, HUFFMAN_EMIT_SYMBOL, 3),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 3),
+ (1, HUFFMAN_EMIT_SYMBOL, 4),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 4),
+ (1, HUFFMAN_EMIT_SYMBOL, 5),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 5),
+ (1, HUFFMAN_EMIT_SYMBOL, 6),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 6),
+ (1, HUFFMAN_EMIT_SYMBOL, 7),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 7),
+
+ # Node 226
+ (3, HUFFMAN_EMIT_SYMBOL, 254),
+ (6, HUFFMAN_EMIT_SYMBOL, 254),
+ (10, HUFFMAN_EMIT_SYMBOL, 254),
+ (15, HUFFMAN_EMIT_SYMBOL, 254),
+ (24, HUFFMAN_EMIT_SYMBOL, 254),
+ (31, HUFFMAN_EMIT_SYMBOL, 254),
+ (41, HUFFMAN_EMIT_SYMBOL, 254),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 254),
+ (2, HUFFMAN_EMIT_SYMBOL, 2),
+ (9, HUFFMAN_EMIT_SYMBOL, 2),
+ (23, HUFFMAN_EMIT_SYMBOL, 2),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 2),
+ (2, HUFFMAN_EMIT_SYMBOL, 3),
+ (9, HUFFMAN_EMIT_SYMBOL, 3),
+ (23, HUFFMAN_EMIT_SYMBOL, 3),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 3),
+
+ # Node 227
+ (3, HUFFMAN_EMIT_SYMBOL, 2),
+ (6, HUFFMAN_EMIT_SYMBOL, 2),
+ (10, HUFFMAN_EMIT_SYMBOL, 2),
+ (15, HUFFMAN_EMIT_SYMBOL, 2),
+ (24, HUFFMAN_EMIT_SYMBOL, 2),
+ (31, HUFFMAN_EMIT_SYMBOL, 2),
+ (41, HUFFMAN_EMIT_SYMBOL, 2),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 2),
+ (3, HUFFMAN_EMIT_SYMBOL, 3),
+ (6, HUFFMAN_EMIT_SYMBOL, 3),
+ (10, HUFFMAN_EMIT_SYMBOL, 3),
+ (15, HUFFMAN_EMIT_SYMBOL, 3),
+ (24, HUFFMAN_EMIT_SYMBOL, 3),
+ (31, HUFFMAN_EMIT_SYMBOL, 3),
+ (41, HUFFMAN_EMIT_SYMBOL, 3),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 3),
+
+ # Node 228
+ (2, HUFFMAN_EMIT_SYMBOL, 4),
+ (9, HUFFMAN_EMIT_SYMBOL, 4),
+ (23, HUFFMAN_EMIT_SYMBOL, 4),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 4),
+ (2, HUFFMAN_EMIT_SYMBOL, 5),
+ (9, HUFFMAN_EMIT_SYMBOL, 5),
+ (23, HUFFMAN_EMIT_SYMBOL, 5),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 5),
+ (2, HUFFMAN_EMIT_SYMBOL, 6),
+ (9, HUFFMAN_EMIT_SYMBOL, 6),
+ (23, HUFFMAN_EMIT_SYMBOL, 6),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 6),
+ (2, HUFFMAN_EMIT_SYMBOL, 7),
+ (9, HUFFMAN_EMIT_SYMBOL, 7),
+ (23, HUFFMAN_EMIT_SYMBOL, 7),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 7),
+
+ # Node 229
+ (3, HUFFMAN_EMIT_SYMBOL, 4),
+ (6, HUFFMAN_EMIT_SYMBOL, 4),
+ (10, HUFFMAN_EMIT_SYMBOL, 4),
+ (15, HUFFMAN_EMIT_SYMBOL, 4),
+ (24, HUFFMAN_EMIT_SYMBOL, 4),
+ (31, HUFFMAN_EMIT_SYMBOL, 4),
+ (41, HUFFMAN_EMIT_SYMBOL, 4),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 4),
+ (3, HUFFMAN_EMIT_SYMBOL, 5),
+ (6, HUFFMAN_EMIT_SYMBOL, 5),
+ (10, HUFFMAN_EMIT_SYMBOL, 5),
+ (15, HUFFMAN_EMIT_SYMBOL, 5),
+ (24, HUFFMAN_EMIT_SYMBOL, 5),
+ (31, HUFFMAN_EMIT_SYMBOL, 5),
+ (41, HUFFMAN_EMIT_SYMBOL, 5),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 5),
+
+ # Node 230
+ (3, HUFFMAN_EMIT_SYMBOL, 6),
+ (6, HUFFMAN_EMIT_SYMBOL, 6),
+ (10, HUFFMAN_EMIT_SYMBOL, 6),
+ (15, HUFFMAN_EMIT_SYMBOL, 6),
+ (24, HUFFMAN_EMIT_SYMBOL, 6),
+ (31, HUFFMAN_EMIT_SYMBOL, 6),
+ (41, HUFFMAN_EMIT_SYMBOL, 6),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 6),
+ (3, HUFFMAN_EMIT_SYMBOL, 7),
+ (6, HUFFMAN_EMIT_SYMBOL, 7),
+ (10, HUFFMAN_EMIT_SYMBOL, 7),
+ (15, HUFFMAN_EMIT_SYMBOL, 7),
+ (24, HUFFMAN_EMIT_SYMBOL, 7),
+ (31, HUFFMAN_EMIT_SYMBOL, 7),
+ (41, HUFFMAN_EMIT_SYMBOL, 7),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 7),
+
+ # Node 231
+ (1, HUFFMAN_EMIT_SYMBOL, 8),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 8),
+ (1, HUFFMAN_EMIT_SYMBOL, 11),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 11),
+ (1, HUFFMAN_EMIT_SYMBOL, 12),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 12),
+ (1, HUFFMAN_EMIT_SYMBOL, 14),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 14),
+ (1, HUFFMAN_EMIT_SYMBOL, 15),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 15),
+ (1, HUFFMAN_EMIT_SYMBOL, 16),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 16),
+ (1, HUFFMAN_EMIT_SYMBOL, 17),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 17),
+ (1, HUFFMAN_EMIT_SYMBOL, 18),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 18),
+
+ # Node 232
+ (2, HUFFMAN_EMIT_SYMBOL, 8),
+ (9, HUFFMAN_EMIT_SYMBOL, 8),
+ (23, HUFFMAN_EMIT_SYMBOL, 8),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 8),
+ (2, HUFFMAN_EMIT_SYMBOL, 11),
+ (9, HUFFMAN_EMIT_SYMBOL, 11),
+ (23, HUFFMAN_EMIT_SYMBOL, 11),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 11),
+ (2, HUFFMAN_EMIT_SYMBOL, 12),
+ (9, HUFFMAN_EMIT_SYMBOL, 12),
+ (23, HUFFMAN_EMIT_SYMBOL, 12),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 12),
+ (2, HUFFMAN_EMIT_SYMBOL, 14),
+ (9, HUFFMAN_EMIT_SYMBOL, 14),
+ (23, HUFFMAN_EMIT_SYMBOL, 14),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 14),
+
+ # Node 233
+ (3, HUFFMAN_EMIT_SYMBOL, 8),
+ (6, HUFFMAN_EMIT_SYMBOL, 8),
+ (10, HUFFMAN_EMIT_SYMBOL, 8),
+ (15, HUFFMAN_EMIT_SYMBOL, 8),
+ (24, HUFFMAN_EMIT_SYMBOL, 8),
+ (31, HUFFMAN_EMIT_SYMBOL, 8),
+ (41, HUFFMAN_EMIT_SYMBOL, 8),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 8),
+ (3, HUFFMAN_EMIT_SYMBOL, 11),
+ (6, HUFFMAN_EMIT_SYMBOL, 11),
+ (10, HUFFMAN_EMIT_SYMBOL, 11),
+ (15, HUFFMAN_EMIT_SYMBOL, 11),
+ (24, HUFFMAN_EMIT_SYMBOL, 11),
+ (31, HUFFMAN_EMIT_SYMBOL, 11),
+ (41, HUFFMAN_EMIT_SYMBOL, 11),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 11),
+
+ # Node 234
+ (3, HUFFMAN_EMIT_SYMBOL, 12),
+ (6, HUFFMAN_EMIT_SYMBOL, 12),
+ (10, HUFFMAN_EMIT_SYMBOL, 12),
+ (15, HUFFMAN_EMIT_SYMBOL, 12),
+ (24, HUFFMAN_EMIT_SYMBOL, 12),
+ (31, HUFFMAN_EMIT_SYMBOL, 12),
+ (41, HUFFMAN_EMIT_SYMBOL, 12),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 12),
+ (3, HUFFMAN_EMIT_SYMBOL, 14),
+ (6, HUFFMAN_EMIT_SYMBOL, 14),
+ (10, HUFFMAN_EMIT_SYMBOL, 14),
+ (15, HUFFMAN_EMIT_SYMBOL, 14),
+ (24, HUFFMAN_EMIT_SYMBOL, 14),
+ (31, HUFFMAN_EMIT_SYMBOL, 14),
+ (41, HUFFMAN_EMIT_SYMBOL, 14),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 14),
+
+ # Node 235
+ (2, HUFFMAN_EMIT_SYMBOL, 15),
+ (9, HUFFMAN_EMIT_SYMBOL, 15),
+ (23, HUFFMAN_EMIT_SYMBOL, 15),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 15),
+ (2, HUFFMAN_EMIT_SYMBOL, 16),
+ (9, HUFFMAN_EMIT_SYMBOL, 16),
+ (23, HUFFMAN_EMIT_SYMBOL, 16),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 16),
+ (2, HUFFMAN_EMIT_SYMBOL, 17),
+ (9, HUFFMAN_EMIT_SYMBOL, 17),
+ (23, HUFFMAN_EMIT_SYMBOL, 17),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 17),
+ (2, HUFFMAN_EMIT_SYMBOL, 18),
+ (9, HUFFMAN_EMIT_SYMBOL, 18),
+ (23, HUFFMAN_EMIT_SYMBOL, 18),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 18),
+
+ # Node 236
+ (3, HUFFMAN_EMIT_SYMBOL, 15),
+ (6, HUFFMAN_EMIT_SYMBOL, 15),
+ (10, HUFFMAN_EMIT_SYMBOL, 15),
+ (15, HUFFMAN_EMIT_SYMBOL, 15),
+ (24, HUFFMAN_EMIT_SYMBOL, 15),
+ (31, HUFFMAN_EMIT_SYMBOL, 15),
+ (41, HUFFMAN_EMIT_SYMBOL, 15),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 15),
+ (3, HUFFMAN_EMIT_SYMBOL, 16),
+ (6, HUFFMAN_EMIT_SYMBOL, 16),
+ (10, HUFFMAN_EMIT_SYMBOL, 16),
+ (15, HUFFMAN_EMIT_SYMBOL, 16),
+ (24, HUFFMAN_EMIT_SYMBOL, 16),
+ (31, HUFFMAN_EMIT_SYMBOL, 16),
+ (41, HUFFMAN_EMIT_SYMBOL, 16),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 16),
+
+ # Node 237
+ (3, HUFFMAN_EMIT_SYMBOL, 17),
+ (6, HUFFMAN_EMIT_SYMBOL, 17),
+ (10, HUFFMAN_EMIT_SYMBOL, 17),
+ (15, HUFFMAN_EMIT_SYMBOL, 17),
+ (24, HUFFMAN_EMIT_SYMBOL, 17),
+ (31, HUFFMAN_EMIT_SYMBOL, 17),
+ (41, HUFFMAN_EMIT_SYMBOL, 17),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 17),
+ (3, HUFFMAN_EMIT_SYMBOL, 18),
+ (6, HUFFMAN_EMIT_SYMBOL, 18),
+ (10, HUFFMAN_EMIT_SYMBOL, 18),
+ (15, HUFFMAN_EMIT_SYMBOL, 18),
+ (24, HUFFMAN_EMIT_SYMBOL, 18),
+ (31, HUFFMAN_EMIT_SYMBOL, 18),
+ (41, HUFFMAN_EMIT_SYMBOL, 18),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 18),
+
+ # Node 238
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 19),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 20),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 21),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 23),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 24),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 25),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 26),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 27),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 28),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 29),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 30),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 31),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 127),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 220),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 249),
+ (253, 0, 0),
+
+ # Node 239
+ (1, HUFFMAN_EMIT_SYMBOL, 19),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 19),
+ (1, HUFFMAN_EMIT_SYMBOL, 20),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 20),
+ (1, HUFFMAN_EMIT_SYMBOL, 21),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 21),
+ (1, HUFFMAN_EMIT_SYMBOL, 23),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 23),
+ (1, HUFFMAN_EMIT_SYMBOL, 24),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 24),
+ (1, HUFFMAN_EMIT_SYMBOL, 25),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 25),
+ (1, HUFFMAN_EMIT_SYMBOL, 26),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 26),
+ (1, HUFFMAN_EMIT_SYMBOL, 27),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 27),
+
+ # Node 240
+ (2, HUFFMAN_EMIT_SYMBOL, 19),
+ (9, HUFFMAN_EMIT_SYMBOL, 19),
+ (23, HUFFMAN_EMIT_SYMBOL, 19),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 19),
+ (2, HUFFMAN_EMIT_SYMBOL, 20),
+ (9, HUFFMAN_EMIT_SYMBOL, 20),
+ (23, HUFFMAN_EMIT_SYMBOL, 20),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 20),
+ (2, HUFFMAN_EMIT_SYMBOL, 21),
+ (9, HUFFMAN_EMIT_SYMBOL, 21),
+ (23, HUFFMAN_EMIT_SYMBOL, 21),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 21),
+ (2, HUFFMAN_EMIT_SYMBOL, 23),
+ (9, HUFFMAN_EMIT_SYMBOL, 23),
+ (23, HUFFMAN_EMIT_SYMBOL, 23),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 23),
+
+ # Node 241
+ (3, HUFFMAN_EMIT_SYMBOL, 19),
+ (6, HUFFMAN_EMIT_SYMBOL, 19),
+ (10, HUFFMAN_EMIT_SYMBOL, 19),
+ (15, HUFFMAN_EMIT_SYMBOL, 19),
+ (24, HUFFMAN_EMIT_SYMBOL, 19),
+ (31, HUFFMAN_EMIT_SYMBOL, 19),
+ (41, HUFFMAN_EMIT_SYMBOL, 19),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 19),
+ (3, HUFFMAN_EMIT_SYMBOL, 20),
+ (6, HUFFMAN_EMIT_SYMBOL, 20),
+ (10, HUFFMAN_EMIT_SYMBOL, 20),
+ (15, HUFFMAN_EMIT_SYMBOL, 20),
+ (24, HUFFMAN_EMIT_SYMBOL, 20),
+ (31, HUFFMAN_EMIT_SYMBOL, 20),
+ (41, HUFFMAN_EMIT_SYMBOL, 20),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 20),
+
+ # Node 242
+ (3, HUFFMAN_EMIT_SYMBOL, 21),
+ (6, HUFFMAN_EMIT_SYMBOL, 21),
+ (10, HUFFMAN_EMIT_SYMBOL, 21),
+ (15, HUFFMAN_EMIT_SYMBOL, 21),
+ (24, HUFFMAN_EMIT_SYMBOL, 21),
+ (31, HUFFMAN_EMIT_SYMBOL, 21),
+ (41, HUFFMAN_EMIT_SYMBOL, 21),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 21),
+ (3, HUFFMAN_EMIT_SYMBOL, 23),
+ (6, HUFFMAN_EMIT_SYMBOL, 23),
+ (10, HUFFMAN_EMIT_SYMBOL, 23),
+ (15, HUFFMAN_EMIT_SYMBOL, 23),
+ (24, HUFFMAN_EMIT_SYMBOL, 23),
+ (31, HUFFMAN_EMIT_SYMBOL, 23),
+ (41, HUFFMAN_EMIT_SYMBOL, 23),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 23),
+
+ # Node 243
+ (2, HUFFMAN_EMIT_SYMBOL, 24),
+ (9, HUFFMAN_EMIT_SYMBOL, 24),
+ (23, HUFFMAN_EMIT_SYMBOL, 24),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 24),
+ (2, HUFFMAN_EMIT_SYMBOL, 25),
+ (9, HUFFMAN_EMIT_SYMBOL, 25),
+ (23, HUFFMAN_EMIT_SYMBOL, 25),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 25),
+ (2, HUFFMAN_EMIT_SYMBOL, 26),
+ (9, HUFFMAN_EMIT_SYMBOL, 26),
+ (23, HUFFMAN_EMIT_SYMBOL, 26),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 26),
+ (2, HUFFMAN_EMIT_SYMBOL, 27),
+ (9, HUFFMAN_EMIT_SYMBOL, 27),
+ (23, HUFFMAN_EMIT_SYMBOL, 27),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 27),
+
+ # Node 244
+ (3, HUFFMAN_EMIT_SYMBOL, 24),
+ (6, HUFFMAN_EMIT_SYMBOL, 24),
+ (10, HUFFMAN_EMIT_SYMBOL, 24),
+ (15, HUFFMAN_EMIT_SYMBOL, 24),
+ (24, HUFFMAN_EMIT_SYMBOL, 24),
+ (31, HUFFMAN_EMIT_SYMBOL, 24),
+ (41, HUFFMAN_EMIT_SYMBOL, 24),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 24),
+ (3, HUFFMAN_EMIT_SYMBOL, 25),
+ (6, HUFFMAN_EMIT_SYMBOL, 25),
+ (10, HUFFMAN_EMIT_SYMBOL, 25),
+ (15, HUFFMAN_EMIT_SYMBOL, 25),
+ (24, HUFFMAN_EMIT_SYMBOL, 25),
+ (31, HUFFMAN_EMIT_SYMBOL, 25),
+ (41, HUFFMAN_EMIT_SYMBOL, 25),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 25),
+
+ # Node 245
+ (3, HUFFMAN_EMIT_SYMBOL, 26),
+ (6, HUFFMAN_EMIT_SYMBOL, 26),
+ (10, HUFFMAN_EMIT_SYMBOL, 26),
+ (15, HUFFMAN_EMIT_SYMBOL, 26),
+ (24, HUFFMAN_EMIT_SYMBOL, 26),
+ (31, HUFFMAN_EMIT_SYMBOL, 26),
+ (41, HUFFMAN_EMIT_SYMBOL, 26),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 26),
+ (3, HUFFMAN_EMIT_SYMBOL, 27),
+ (6, HUFFMAN_EMIT_SYMBOL, 27),
+ (10, HUFFMAN_EMIT_SYMBOL, 27),
+ (15, HUFFMAN_EMIT_SYMBOL, 27),
+ (24, HUFFMAN_EMIT_SYMBOL, 27),
+ (31, HUFFMAN_EMIT_SYMBOL, 27),
+ (41, HUFFMAN_EMIT_SYMBOL, 27),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 27),
+
+ # Node 246
+ (1, HUFFMAN_EMIT_SYMBOL, 28),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 28),
+ (1, HUFFMAN_EMIT_SYMBOL, 29),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 29),
+ (1, HUFFMAN_EMIT_SYMBOL, 30),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 30),
+ (1, HUFFMAN_EMIT_SYMBOL, 31),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 31),
+ (1, HUFFMAN_EMIT_SYMBOL, 127),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 127),
+ (1, HUFFMAN_EMIT_SYMBOL, 220),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 220),
+ (1, HUFFMAN_EMIT_SYMBOL, 249),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 249),
+ (254, 0, 0),
+ (255, 0, 0),
+
+ # Node 247
+ (2, HUFFMAN_EMIT_SYMBOL, 28),
+ (9, HUFFMAN_EMIT_SYMBOL, 28),
+ (23, HUFFMAN_EMIT_SYMBOL, 28),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 28),
+ (2, HUFFMAN_EMIT_SYMBOL, 29),
+ (9, HUFFMAN_EMIT_SYMBOL, 29),
+ (23, HUFFMAN_EMIT_SYMBOL, 29),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 29),
+ (2, HUFFMAN_EMIT_SYMBOL, 30),
+ (9, HUFFMAN_EMIT_SYMBOL, 30),
+ (23, HUFFMAN_EMIT_SYMBOL, 30),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 30),
+ (2, HUFFMAN_EMIT_SYMBOL, 31),
+ (9, HUFFMAN_EMIT_SYMBOL, 31),
+ (23, HUFFMAN_EMIT_SYMBOL, 31),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 31),
+
+ # Node 248
+ (3, HUFFMAN_EMIT_SYMBOL, 28),
+ (6, HUFFMAN_EMIT_SYMBOL, 28),
+ (10, HUFFMAN_EMIT_SYMBOL, 28),
+ (15, HUFFMAN_EMIT_SYMBOL, 28),
+ (24, HUFFMAN_EMIT_SYMBOL, 28),
+ (31, HUFFMAN_EMIT_SYMBOL, 28),
+ (41, HUFFMAN_EMIT_SYMBOL, 28),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 28),
+ (3, HUFFMAN_EMIT_SYMBOL, 29),
+ (6, HUFFMAN_EMIT_SYMBOL, 29),
+ (10, HUFFMAN_EMIT_SYMBOL, 29),
+ (15, HUFFMAN_EMIT_SYMBOL, 29),
+ (24, HUFFMAN_EMIT_SYMBOL, 29),
+ (31, HUFFMAN_EMIT_SYMBOL, 29),
+ (41, HUFFMAN_EMIT_SYMBOL, 29),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 29),
+
+ # Node 249
+ (3, HUFFMAN_EMIT_SYMBOL, 30),
+ (6, HUFFMAN_EMIT_SYMBOL, 30),
+ (10, HUFFMAN_EMIT_SYMBOL, 30),
+ (15, HUFFMAN_EMIT_SYMBOL, 30),
+ (24, HUFFMAN_EMIT_SYMBOL, 30),
+ (31, HUFFMAN_EMIT_SYMBOL, 30),
+ (41, HUFFMAN_EMIT_SYMBOL, 30),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 30),
+ (3, HUFFMAN_EMIT_SYMBOL, 31),
+ (6, HUFFMAN_EMIT_SYMBOL, 31),
+ (10, HUFFMAN_EMIT_SYMBOL, 31),
+ (15, HUFFMAN_EMIT_SYMBOL, 31),
+ (24, HUFFMAN_EMIT_SYMBOL, 31),
+ (31, HUFFMAN_EMIT_SYMBOL, 31),
+ (41, HUFFMAN_EMIT_SYMBOL, 31),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 31),
+
+ # Node 250
+ (2, HUFFMAN_EMIT_SYMBOL, 127),
+ (9, HUFFMAN_EMIT_SYMBOL, 127),
+ (23, HUFFMAN_EMIT_SYMBOL, 127),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 127),
+ (2, HUFFMAN_EMIT_SYMBOL, 220),
+ (9, HUFFMAN_EMIT_SYMBOL, 220),
+ (23, HUFFMAN_EMIT_SYMBOL, 220),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 220),
+ (2, HUFFMAN_EMIT_SYMBOL, 249),
+ (9, HUFFMAN_EMIT_SYMBOL, 249),
+ (23, HUFFMAN_EMIT_SYMBOL, 249),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 249),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 10),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 13),
+ (0, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 22),
+ (0, HUFFMAN_FAIL, 0),
+
+ # Node 251
+ (3, HUFFMAN_EMIT_SYMBOL, 127),
+ (6, HUFFMAN_EMIT_SYMBOL, 127),
+ (10, HUFFMAN_EMIT_SYMBOL, 127),
+ (15, HUFFMAN_EMIT_SYMBOL, 127),
+ (24, HUFFMAN_EMIT_SYMBOL, 127),
+ (31, HUFFMAN_EMIT_SYMBOL, 127),
+ (41, HUFFMAN_EMIT_SYMBOL, 127),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 127),
+ (3, HUFFMAN_EMIT_SYMBOL, 220),
+ (6, HUFFMAN_EMIT_SYMBOL, 220),
+ (10, HUFFMAN_EMIT_SYMBOL, 220),
+ (15, HUFFMAN_EMIT_SYMBOL, 220),
+ (24, HUFFMAN_EMIT_SYMBOL, 220),
+ (31, HUFFMAN_EMIT_SYMBOL, 220),
+ (41, HUFFMAN_EMIT_SYMBOL, 220),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 220),
+
+ # Node 252
+ (3, HUFFMAN_EMIT_SYMBOL, 249),
+ (6, HUFFMAN_EMIT_SYMBOL, 249),
+ (10, HUFFMAN_EMIT_SYMBOL, 249),
+ (15, HUFFMAN_EMIT_SYMBOL, 249),
+ (24, HUFFMAN_EMIT_SYMBOL, 249),
+ (31, HUFFMAN_EMIT_SYMBOL, 249),
+ (41, HUFFMAN_EMIT_SYMBOL, 249),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 249),
+ (1, HUFFMAN_EMIT_SYMBOL, 10),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 10),
+ (1, HUFFMAN_EMIT_SYMBOL, 13),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 13),
+ (1, HUFFMAN_EMIT_SYMBOL, 22),
+ (22, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 22),
+ (0, HUFFMAN_FAIL, 0),
+ (0, HUFFMAN_FAIL, 0),
+
+ # Node 253
+ (2, HUFFMAN_EMIT_SYMBOL, 10),
+ (9, HUFFMAN_EMIT_SYMBOL, 10),
+ (23, HUFFMAN_EMIT_SYMBOL, 10),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 10),
+ (2, HUFFMAN_EMIT_SYMBOL, 13),
+ (9, HUFFMAN_EMIT_SYMBOL, 13),
+ (23, HUFFMAN_EMIT_SYMBOL, 13),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 13),
+ (2, HUFFMAN_EMIT_SYMBOL, 22),
+ (9, HUFFMAN_EMIT_SYMBOL, 22),
+ (23, HUFFMAN_EMIT_SYMBOL, 22),
+ (40, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 22),
+ (0, HUFFMAN_FAIL, 0),
+ (0, HUFFMAN_FAIL, 0),
+ (0, HUFFMAN_FAIL, 0),
+ (0, HUFFMAN_FAIL, 0),
+
+ # Node 254
+ (3, HUFFMAN_EMIT_SYMBOL, 10),
+ (6, HUFFMAN_EMIT_SYMBOL, 10),
+ (10, HUFFMAN_EMIT_SYMBOL, 10),
+ (15, HUFFMAN_EMIT_SYMBOL, 10),
+ (24, HUFFMAN_EMIT_SYMBOL, 10),
+ (31, HUFFMAN_EMIT_SYMBOL, 10),
+ (41, HUFFMAN_EMIT_SYMBOL, 10),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 10),
+ (3, HUFFMAN_EMIT_SYMBOL, 13),
+ (6, HUFFMAN_EMIT_SYMBOL, 13),
+ (10, HUFFMAN_EMIT_SYMBOL, 13),
+ (15, HUFFMAN_EMIT_SYMBOL, 13),
+ (24, HUFFMAN_EMIT_SYMBOL, 13),
+ (31, HUFFMAN_EMIT_SYMBOL, 13),
+ (41, HUFFMAN_EMIT_SYMBOL, 13),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 13),
+
+ # Node 255
+ (3, HUFFMAN_EMIT_SYMBOL, 22),
+ (6, HUFFMAN_EMIT_SYMBOL, 22),
+ (10, HUFFMAN_EMIT_SYMBOL, 22),
+ (15, HUFFMAN_EMIT_SYMBOL, 22),
+ (24, HUFFMAN_EMIT_SYMBOL, 22),
+ (31, HUFFMAN_EMIT_SYMBOL, 22),
+ (41, HUFFMAN_EMIT_SYMBOL, 22),
+ (56, HUFFMAN_COMPLETE | HUFFMAN_EMIT_SYMBOL, 22),
+ (0, HUFFMAN_FAIL, 0),
+ (0, HUFFMAN_FAIL, 0),
+ (0, HUFFMAN_FAIL, 0),
+ (0, HUFFMAN_FAIL, 0),
+ (0, HUFFMAN_FAIL, 0),
+ (0, HUFFMAN_FAIL, 0),
+ (0, HUFFMAN_FAIL, 0),
+ (0, HUFFMAN_FAIL, 0),
+]
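The long table above is the transition table of a nibble-driven Huffman decoding state machine: each entry is a `(next_state, flags, emitted_byte)` triple, sixteen per state (one per possible 4-bit input). A minimal sketch of how such a table is consumed — the flag values and toy table here are illustrative assumptions, not hpack's actual decoder:

```python
HUFFMAN_EMIT_SYMBOL = 1 << 1  # assumed flag values, for illustration only
HUFFMAN_FAIL = 1 << 2

def huffman_decode(data: bytes, table) -> bytes:
    # Process the input four bits at a time; each nibble selects one of
    # the sixteen transitions belonging to the current state.
    state = 0
    out = bytearray()
    for byte in data:
        for nibble in (byte >> 4, byte & 0x0F):
            state, flags, symbol = table[state * 16 + nibble]
            if flags & HUFFMAN_FAIL:
                raise ValueError("invalid Huffman-coded data")
            if flags & HUFFMAN_EMIT_SYMBOL:
                out.append(symbol)
    return bytes(out)

# Toy one-state table: every nibble emits its own value and loops back
# to state 0 (a stand-in for the 256-state table above).
toy_table = [(0, HUFFMAN_EMIT_SYMBOL, n) for n in range(16)]
```

With this toy table, `huffman_decode(b"\x12", toy_table)` emits the two nibbles of `0x12` as `bytes([1, 2])`; the real table instead walks the HPACK canonical Huffman code (RFC 7541 Appendix B) one nibble per transition.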
diff --git a/.venv/lib/python3.9/site-packages/hpack/struct.py b/.venv/lib/python3.9/site-packages/hpack/struct.py
new file mode 100644
index 0000000..fcab929
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack/struct.py
@@ -0,0 +1,39 @@
+# -*- coding: utf-8 -*-
+"""
+hpack/struct
+~~~~~~~~~~~~
+
+Contains structures for representing header fields with associated metadata.
+"""
+
+
+class HeaderTuple(tuple):
+ """
+ A data structure that stores a single header field.
+
+ HTTP headers can be thought of as tuples of ``(field name, field value)``.
+ A single header block is a sequence of such tuples.
+
+ In HTTP/2, however, certain bits of additional information are required for
+ compressing these headers: in particular, whether the header field can be
+ safely added to the HPACK compression context.
+
+ This class stores a header that can be added to the compression context. In
+ all other ways it behaves exactly like a tuple.
+ """
+ __slots__ = ()
+
+ indexable = True
+
+ def __new__(cls, *args):
+ return tuple.__new__(cls, args)
+
+
+class NeverIndexedHeaderTuple(HeaderTuple):
+ """
+ A data structure that stores a single header field that cannot be added to
+ an HTTP/2 header compression context.
+ """
+ __slots__ = ()
+
+ indexable = False
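The two classes above follow a lightweight pattern: subclass `tuple`, keep instances allocation-free with empty `__slots__`, and hang a class-level flag off the type so no per-instance storage is needed. A self-contained sketch mirroring the definitions above:

```python
class HeaderTuple(tuple):
    """A (name, value) header pair that may enter the HPACK context."""
    __slots__ = ()          # no per-instance __dict__; stays as cheap as a tuple
    indexable = True        # class attribute shared by every instance

    def __new__(cls, *args):
        # Collect the positional arguments into the tuple's contents.
        return tuple.__new__(cls, args)


class NeverIndexedHeaderTuple(HeaderTuple):
    """A header pair that must never enter the compression context."""
    __slots__ = ()
    indexable = False


plain = HeaderTuple(b"content-type", b"text/html")
secret = NeverIndexedHeaderTuple(b"authorization", b"token")
```

Because both are real tuples, they unpack, compare, and hash like `(name, value)` pairs; code that does not care about indexability can treat them interchangeably, while the encoder checks `indexable` to decide the HPACK representation.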
diff --git a/.venv/lib/python3.9/site-packages/hpack/table.py b/.venv/lib/python3.9/site-packages/hpack/table.py
new file mode 100644
index 0000000..2b656f3
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hpack/table.py
@@ -0,0 +1,235 @@
+# -*- coding: utf-8 -*-
+# flake8: noqa
+from collections import deque
+import logging
+
+from .exceptions import InvalidTableIndex
+
+log = logging.getLogger(__name__)
+
+
+def table_entry_size(name, value):
+ """
+ Calculates the size of a single entry
+
+ This size is mostly irrelevant to us and defined
+ specifically to accommodate memory management for
+ lower level implementations. The 32 extra bytes are
+ considered the "maximum" overhead that would be
+ required to represent each entry in the table.
+
+ See RFC7541 Section 4.1
+ """
+ return 32 + len(name) + len(value)
+
+
+class HeaderTable:
+ """
+ Implements the combined static and dynamic header table
+
+ The name and value arguments for all the functions
+ should ONLY be byte strings (b''); however, this is not
+ strictly enforced in the interface.
+
+ See RFC7541 Section 2.3
+ """
+ #: Default maximum size of the dynamic table. See
+ #: RFC7540 Section 6.5.2.
+ DEFAULT_SIZE = 4096
+
+ #: Constant list of static headers. See RFC7541 Section
+ #: 2.3.1 and Appendix A
+ STATIC_TABLE = (
+ (b':authority' , b'' ), # noqa
+ (b':method' , b'GET' ), # noqa
+ (b':method' , b'POST' ), # noqa
+ (b':path' , b'/' ), # noqa
+ (b':path' , b'/index.html' ), # noqa
+ (b':scheme' , b'http' ), # noqa
+ (b':scheme' , b'https' ), # noqa
+ (b':status' , b'200' ), # noqa
+ (b':status' , b'204' ), # noqa
+ (b':status' , b'206' ), # noqa
+ (b':status' , b'304' ), # noqa
+ (b':status' , b'400' ), # noqa
+ (b':status' , b'404' ), # noqa
+ (b':status' , b'500' ), # noqa
+ (b'accept-charset' , b'' ), # noqa
+ (b'accept-encoding' , b'gzip, deflate'), # noqa
+ (b'accept-language' , b'' ), # noqa
+ (b'accept-ranges' , b'' ), # noqa
+ (b'accept' , b'' ), # noqa
+ (b'access-control-allow-origin' , b'' ), # noqa
+ (b'age' , b'' ), # noqa
+ (b'allow' , b'' ), # noqa
+ (b'authorization' , b'' ), # noqa
+ (b'cache-control' , b'' ), # noqa
+ (b'content-disposition' , b'' ), # noqa
+ (b'content-encoding' , b'' ), # noqa
+ (b'content-language' , b'' ), # noqa
+ (b'content-length' , b'' ), # noqa
+ (b'content-location' , b'' ), # noqa
+ (b'content-range' , b'' ), # noqa
+ (b'content-type' , b'' ), # noqa
+ (b'cookie' , b'' ), # noqa
+ (b'date' , b'' ), # noqa
+ (b'etag' , b'' ), # noqa
+ (b'expect' , b'' ), # noqa
+ (b'expires' , b'' ), # noqa
+ (b'from' , b'' ), # noqa
+ (b'host' , b'' ), # noqa
+ (b'if-match' , b'' ), # noqa
+ (b'if-modified-since' , b'' ), # noqa
+ (b'if-none-match' , b'' ), # noqa
+ (b'if-range' , b'' ), # noqa
+ (b'if-unmodified-since' , b'' ), # noqa
+ (b'last-modified' , b'' ), # noqa
+ (b'link' , b'' ), # noqa
+ (b'location' , b'' ), # noqa
+ (b'max-forwards' , b'' ), # noqa
+ (b'proxy-authenticate' , b'' ), # noqa
+ (b'proxy-authorization' , b'' ), # noqa
+ (b'range' , b'' ), # noqa
+ (b'referer' , b'' ), # noqa
+ (b'refresh' , b'' ), # noqa
+ (b'retry-after' , b'' ), # noqa
+ (b'server' , b'' ), # noqa
+ (b'set-cookie' , b'' ), # noqa
+ (b'strict-transport-security' , b'' ), # noqa
+ (b'transfer-encoding' , b'' ), # noqa
+ (b'user-agent' , b'' ), # noqa
+ (b'vary' , b'' ), # noqa
+ (b'via' , b'' ), # noqa
+ (b'www-authenticate' , b'' ), # noqa
+ ) # noqa
+
+ STATIC_TABLE_LENGTH = len(STATIC_TABLE)
+
+ def __init__(self):
+ self._maxsize = HeaderTable.DEFAULT_SIZE
+ self._current_size = 0
+ self.resized = False
+ self.dynamic_entries = deque()
+
+ def get_by_index(self, index):
+ """
+ Returns the entry specified by index
+
+ Note that the table is 1-based, i.e. an index of 0 is
+ invalid. This is due to the fact that a zero value
+ index signals that a completely unindexed header
+ follows.
+
+ The entry will either be from the static table or
+ the dynamic table depending on the value of index.
+ """
+ original_index = index
+ index -= 1
+ if 0 <= index:
+ if index < HeaderTable.STATIC_TABLE_LENGTH:
+ return HeaderTable.STATIC_TABLE[index]
+
+ index -= HeaderTable.STATIC_TABLE_LENGTH
+ if index < len(self.dynamic_entries):
+ return self.dynamic_entries[index]
+
+ raise InvalidTableIndex("Invalid table index %d" % original_index)
+
+ def __repr__(self):
+ return "HeaderTable(%d, %s, %r)" % (
+ self._maxsize,
+ self.resized,
+ self.dynamic_entries
+ )
+
+ def add(self, name, value):
+ """
+ Adds a new entry to the table
+
+ We reduce the table size if the entry will make the
+ table size greater than maxsize.
+ """
+ # We just clear the table if the entry is too big
+ size = table_entry_size(name, value)
+ if size > self._maxsize:
+ self.dynamic_entries.clear()
+ self._current_size = 0
+ else:
+ # Add new entry
+ self.dynamic_entries.appendleft((name, value))
+ self._current_size += size
+ self._shrink()
+
+ def search(self, name, value):
+ """
+ Searches the table for the entry specified by name
+ and value
+
+ Returns one of the following:
+ - ``None``, no match at all
+ - ``(index, name, None)`` for partial matches on name only.
+ - ``(index, name, value)`` for perfect matches.
+ """
+ partial = None
+
+ header_name_search_result = HeaderTable.STATIC_TABLE_MAPPING.get(name)
+ if header_name_search_result:
+ index = header_name_search_result[1].get(value)
+ if index is not None:
+ return index, name, value
+ else:
+ partial = (header_name_search_result[0], name, None)
+
+ offset = HeaderTable.STATIC_TABLE_LENGTH + 1
+ for (i, (n, v)) in enumerate(self.dynamic_entries):
+ if n == name:
+ if v == value:
+ return i + offset, n, v
+ elif partial is None:
+ partial = (i + offset, n, None)
+ return partial
+
+ @property
+ def maxsize(self):
+ return self._maxsize
+
+ @maxsize.setter
+ def maxsize(self, newmax):
+ newmax = int(newmax)
+ log.debug("Resizing header table to %d from %d", newmax, self._maxsize)
+ oldmax = self._maxsize
+ self._maxsize = newmax
+ self.resized = (newmax != oldmax)
+ if newmax <= 0:
+ self.dynamic_entries.clear()
+ self._current_size = 0
+ elif oldmax > newmax:
+ self._shrink()
+
+ def _shrink(self):
+ """
+ Shrinks the dynamic table to be at or below maxsize
+ """
+ cursize = self._current_size
+ while cursize > self._maxsize:
+ name, value = self.dynamic_entries.pop()
+ cursize -= table_entry_size(name, value)
+ log.debug("Evicting %s: %s from the header table", name, value)
+ self._current_size = cursize
+
+
+def _build_static_table_mapping():
+ """
+ Build the static table mapping from header name to a tuple of the form
+ (<index of the first entry with that name>, {header value: index}).
+
+ The mapping is used by ``HeaderTable.search`` for constant-time lookups.
+ """
+ static_table_mapping = {}
+ for index, (name, value) in enumerate(HeaderTable.STATIC_TABLE, 1):
+ header_name_search_result = static_table_mapping.setdefault(name, (index, {}))
+ header_name_search_result[1][value] = index
+ return static_table_mapping
+
+
+HeaderTable.STATIC_TABLE_MAPPING = _build_static_table_mapping()
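The size accounting in `HeaderTable` above can be checked directly: per RFC 7541 Section 4.1 each entry costs its name length plus its value length plus 32 bytes of fixed overhead, and `_shrink` evicts from the oldest end of the deque until the table fits. A standalone sketch of that add-then-evict logic (simplified from the class above; names are illustrative):

```python
from collections import deque

def table_entry_size(name: bytes, value: bytes) -> int:
    # RFC 7541 Section 4.1: 32 bytes of bookkeeping overhead per entry.
    return 32 + len(name) + len(value)

def add_with_eviction(entries: deque, current_size: int,
                      name: bytes, value: bytes, maxsize: int) -> int:
    """Add one entry, evicting oldest entries until at or below maxsize.

    Returns the new table size. An entry bigger than the whole table
    clears it, matching HeaderTable.add above.
    """
    size = table_entry_size(name, value)
    if size > maxsize:
        entries.clear()
        return 0
    entries.appendleft((name, value))   # newest entry has the lowest index
    current_size += size
    while current_size > maxsize:
        old_name, old_value = entries.pop()   # evict from the oldest end
        current_size -= table_entry_size(old_name, old_value)
    return current_size
```

For example, `(b":path", b"/")` costs 32 + 5 + 1 = 38 bytes, so a 100-byte table holds two such entries; adding a third (114 bytes total) forces eviction of the oldest, leaving two entries and 76 bytes again.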
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__about__.py b/.venv/lib/python3.9/site-packages/hypercorn/__about__.py
new file mode 100644
index 0000000..e2bd072
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/__about__.py
@@ -0,0 +1 @@
+__version__ = "0.11.2"
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__init__.py b/.venv/lib/python3.9/site-packages/hypercorn/__init__.py
new file mode 100644
index 0000000..a9b2357
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/__init__.py
@@ -0,0 +1,4 @@
+from .__about__ import __version__
+from .config import Config
+
+__all__ = ("__version__", "Config")
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__main__.py b/.venv/lib/python3.9/site-packages/hypercorn/__main__.py
new file mode 100644
index 0000000..2ae21d0
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/__main__.py
@@ -0,0 +1,271 @@
+import argparse
+import ssl
+import sys
+import warnings
+from typing import List, Optional
+
+from .config import Config
+from .run import run
+
+sentinel = object()
+
+
+def _load_config(config_path: Optional[str]) -> Config:
+ if config_path is None:
+ return Config()
+ elif config_path.startswith("python:"):
+ return Config.from_object(config_path[len("python:") :])
+ elif config_path.startswith("file:"):
+ return Config.from_pyfile(config_path[len("file:") :])
+ else:
+ return Config.from_toml(config_path)
+
+
+def main(sys_args: Optional[List[str]] = None) -> None:
+ parser = argparse.ArgumentParser()
+ parser.add_argument(
+ "application", help="The application to dispatch to as path.to.module:instance.path"
+ )
+ parser.add_argument("--access-log", help="Deprecated, see access-logfile", default=sentinel)
+ parser.add_argument(
+ "--access-logfile",
+ help="The target location for the access log, use `-` for stdout",
+ default=sentinel,
+ )
+ parser.add_argument(
+ "--access-logformat",
+ help="The log format for the access log, see help docs",
+ default=sentinel,
+ )
+ parser.add_argument(
+ "--backlog", help="The maximum number of pending connections", type=int, default=sentinel
+ )
+ parser.add_argument(
+ "-b",
+ "--bind",
+ dest="binds",
+ help=""" The TCP host/address to bind to. Should be either host:port, host,
+ unix:path or fd://num, e.g. 127.0.0.1:5000, 127.0.0.1,
+ unix:/tmp/socket or fd://33 respectively. """,
+ default=[],
+ action="append",
+ )
+ parser.add_argument("--ca-certs", help="Path to the SSL CA certificate file", default=sentinel)
+ parser.add_argument("--certfile", help="Path to the SSL certificate file", default=sentinel)
+ parser.add_argument("--cert-reqs", help="See verify mode argument", type=int, default=sentinel)
+ parser.add_argument("--ciphers", help="Ciphers to use for the SSL setup", default=sentinel)
+ parser.add_argument(
+ "-c",
+ "--config",
+ help="Location of a TOML config file, or when prefixed with `file:` a Python file, or when prefixed with `python:` a Python module.", # noqa: E501
+ default=None,
+ )
+ parser.add_argument(
+ "--debug",
+ help="Enable debug mode, i.e. extra logging and checks",
+ action="store_true",
+ default=sentinel,
+ )
+ parser.add_argument("--error-log", help="Deprecated, see error-logfile", default=sentinel)
+ parser.add_argument(
+ "--error-logfile",
+ "--log-file",
+ dest="error_logfile",
+ help="The target location for the error log, use `-` for stderr",
+ default=sentinel,
+ )
+ parser.add_argument(
+ "--graceful-timeout",
+ help="""Time to wait after SIGTERM or Ctrl-C for any remaining requests (tasks)
+ to complete.""",
+ default=sentinel,
+ type=int,
+ )
+ parser.add_argument(
+ "-g", "--group", help="Group to own any unix sockets.", default=sentinel, type=int
+ )
+ parser.add_argument(
+ "-k",
+ "--worker-class",
+ dest="worker_class",
+ help="The type of worker to use. "
+ "Options include asyncio, uvloop (pip install hypercorn[uvloop]), "
+ "and trio (pip install hypercorn[trio]).",
+ default=sentinel,
+ )
+ parser.add_argument(
+ "--keep-alive",
+ help="Seconds to keep inactive connections alive for",
+ default=sentinel,
+ type=int,
+ )
+ parser.add_argument("--keyfile", help="Path to the SSL key file", default=sentinel)
+ parser.add_argument(
+ "--insecure-bind",
+ dest="insecure_binds",
+ help="""The TCP host/address to bind to. SSL options will not apply to these binds.
+ See *bind* for formatting options. Care must be taken! See HTTP -> HTTPS redirection docs.
+ """,
+ default=[],
+ action="append",
+ )
+ parser.add_argument(
+ "--log-config", help="A Python logging configuration file.", default=sentinel
+ )
+ parser.add_argument(
+ "--log-level", help="The (error) log level, defaults to info", default="INFO"
+ )
+ parser.add_argument(
+ "-p", "--pid", help="Location to write the PID (Process ID) to.", default=sentinel
+ )
+ parser.add_argument(
+ "--quic-bind",
+ dest="quic_binds",
+ help="""The UDP/QUIC host/address to bind to. See *bind* for formatting
+ options.
+ """,
+ default=[],
+ action="append",
+ )
+ parser.add_argument(
+ "--reload",
+ help="Enable automatic reloads on code changes",
+ action="store_true",
+ default=sentinel,
+ )
+ parser.add_argument(
+ "--root-path", help="The setting for the ASGI root_path variable", default=sentinel
+ )
+ parser.add_argument(
+ "--server-name",
+ dest="server_names",
+ help="""The hostnames that can be served, requests to different hosts
+ will be responded to with 404s.
+ """,
+ default=[],
+ action="append",
+ )
+ parser.add_argument(
+ "--statsd-host", help="The host:port of the statsd server", default=sentinel
+ )
+ parser.add_argument("--statsd-prefix", help="Prefix for all statsd messages", default="")
+ parser.add_argument(
+ "-m",
+ "--umask",
+ help="The permissions bit mask to use on any unix sockets.",
+ default=sentinel,
+ type=int,
+ )
+ parser.add_argument(
+ "-u", "--user", help="User to own any unix sockets.", default=sentinel, type=int
+ )
+
+ def _convert_verify_mode(value: str) -> ssl.VerifyMode:
+ try:
+ return ssl.VerifyMode[value]
+ except KeyError:
+ raise argparse.ArgumentTypeError(f"'{value}' is not a valid verify mode")
+
+ parser.add_argument(
+ "--verify-mode",
+ help="SSL verify mode for peer's certificate, see ssl.VerifyMode enum for possible values.",
+ type=_convert_verify_mode,
+ default=sentinel,
+ )
+ parser.add_argument(
+ "--websocket-ping-interval",
+ help="""If set this is the time in seconds between pings sent to the client.
+ This can be used to keep the websocket connection alive.""",
+ default=sentinel,
+ type=int,
+ )
+ parser.add_argument(
+ "-w",
+ "--workers",
+ dest="workers",
+ help="The number of workers to spawn and use",
+ default=sentinel,
+ type=int,
+ )
+ args = parser.parse_args(sys_args or sys.argv[1:])
+ config = _load_config(args.config)
+ config.application_path = args.application
+ config.loglevel = args.log_level
+
+ if args.access_logformat is not sentinel:
+ config.access_log_format = args.access_logformat
+ if args.access_log is not sentinel:
+ warnings.warn(
+ "The --access-log argument is deprecated, use `--access-logfile` instead",
+ DeprecationWarning,
+ )
+ config.accesslog = args.access_log
+ if args.access_logfile is not sentinel:
+ config.accesslog = args.access_logfile
+ if args.backlog is not sentinel:
+ config.backlog = args.backlog
+ if args.ca_certs is not sentinel:
+ config.ca_certs = args.ca_certs
+ if args.certfile is not sentinel:
+ config.certfile = args.certfile
+ if args.cert_reqs is not sentinel:
+ config.cert_reqs = args.cert_reqs
+ if args.ciphers is not sentinel:
+ config.ciphers = args.ciphers
+ if args.debug is not sentinel:
+ config.debug = args.debug
+ if args.error_log is not sentinel:
+ warnings.warn(
+ "The --error-log argument is deprecated, use `--error-logfile` instead",
+ DeprecationWarning,
+ )
+ config.errorlog = args.error_log
+ if args.error_logfile is not sentinel:
+ config.errorlog = args.error_logfile
+ if args.graceful_timeout is not sentinel:
+ config.graceful_timeout = args.graceful_timeout
+ if args.group is not sentinel:
+ config.group = args.group
+ if args.keep_alive is not sentinel:
+ config.keep_alive_timeout = args.keep_alive
+ if args.keyfile is not sentinel:
+ config.keyfile = args.keyfile
+ if args.log_config is not sentinel:
+ config.logconfig = args.log_config
+ if args.pid is not sentinel:
+ config.pid_path = args.pid
+ if args.root_path is not sentinel:
+ config.root_path = args.root_path
+ if args.reload is not sentinel:
+ config.use_reloader = args.reload
+ if args.statsd_host is not sentinel:
+ config.statsd_host = args.statsd_host
+ if args.statsd_prefix is not sentinel:
+ config.statsd_prefix = args.statsd_prefix
+ if args.umask is not sentinel:
+ config.umask = args.umask
+ if args.user is not sentinel:
+ config.user = args.user
+ if args.worker_class is not sentinel:
+ config.worker_class = args.worker_class
+ if args.verify_mode is not sentinel:
+ config.verify_mode = args.verify_mode
+ if args.websocket_ping_interval is not sentinel:
+ config.websocket_ping_interval = args.websocket_ping_interval
+ if args.workers is not sentinel:
+ config.workers = args.workers
+
+ if len(args.binds) > 0:
+ config.bind = args.binds
+ if len(args.insecure_binds) > 0:
+ config.insecure_bind = args.insecure_binds
+ if len(args.quic_binds) > 0:
+ config.quic_bind = args.quic_binds
+ if len(args.server_names) > 0:
+ config.server_names = args.server_names
+
+ run(config)
+
+
+if __name__ == "__main__":
+ main()
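The `sentinel = object()` pattern used throughout `main` above distinguishes "flag not given on the command line" from legitimate falsy values like `0`, `False`, or `None`, so only explicitly passed flags override the loaded config. A stripped-down illustration of the same idea (the `--workers` flag and `apply` helper are hypothetical):

```python
import argparse

sentinel = object()  # unique marker: no user-supplied value can ever be it

parser = argparse.ArgumentParser()
parser.add_argument("--workers", type=int, default=sentinel)

def apply(args: argparse.Namespace, config: dict) -> dict:
    # Only override the config when the flag was actually passed;
    # `is not sentinel` stays correct even for values like 0 or None,
    # where a `default=None` plus truthiness check would misfire.
    if args.workers is not sentinel:
        config["workers"] = args.workers
    return config
```

With this, `--workers 0` deliberately sets zero workers, while omitting the flag leaves the config-file value untouched — exactly the behaviour the long `if args.x is not sentinel` chain above implements for every option.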
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/__about__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/__about__.cpython-39.pyc
new file mode 100644
index 0000000..d4f313d
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/__about__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..e84697e
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/__main__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/__main__.cpython-39.pyc
new file mode 100644
index 0000000..c9dc2a4
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/__main__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/config.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/config.cpython-39.pyc
new file mode 100644
index 0000000..f69f19a
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/config.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/events.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/events.cpython-39.pyc
new file mode 100644
index 0000000..dcb67b2
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/events.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/logging.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/logging.cpython-39.pyc
new file mode 100644
index 0000000..d79b7c2
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/logging.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/run.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/run.cpython-39.pyc
new file mode 100644
index 0000000..982a3ac
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/run.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/statsd.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/statsd.cpython-39.pyc
new file mode 100644
index 0000000..890e86d
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/statsd.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/typing.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/typing.cpython-39.pyc
new file mode 100644
index 0000000..799b62c
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/typing.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/utils.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/utils.cpython-39.pyc
new file mode 100644
index 0000000..903e0c9
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/__pycache__/utils.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__init__.py b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__init__.py
new file mode 100644
index 0000000..475457a
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__init__.py
@@ -0,0 +1,39 @@
+import warnings
+from typing import Awaitable, Callable, Optional
+
+from .run import worker_serve
+from ..config import Config
+from ..typing import ASGIFramework
+
+
+async def serve(
+ app: ASGIFramework,
+ config: Config,
+ *,
+ shutdown_trigger: Optional[Callable[..., Awaitable[None]]] = None,
+) -> None:
+ """Serve an ASGI framework app given the config.
+
+    This allows for a programmatic way to serve an ASGI framework; it
+    can be used via:
+
+ .. code-block:: python
+
+ asyncio.run(serve(app, config))
+
+ It is assumed that the event-loop is configured before calling
+ this function, therefore configuration values that relate to loop
+ setup or process setup are ignored.
+
+ Arguments:
+ app: The ASGI application to serve.
+ config: A Hypercorn configuration object.
+ shutdown_trigger: This should return to trigger a graceful
+ shutdown.
+ """
+ if config.debug:
+        warnings.warn("The config `debug` has no effect when using serve", Warning)
+ if config.workers != 1:
+        warnings.warn("The config `workers` has no effect when using serve", Warning)
+
+ await worker_serve(app, config, shutdown_trigger=shutdown_trigger)
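Reviewer note: `serve` above takes any ASGI callable. As a minimal, stdlib-only sketch of the application shape Hypercorn drives (the `app` name, stub events, and response body are illustrative, not part of this diff):

```python
import asyncio

async def app(scope, receive, send):
    # Minimal ASGI HTTP application: answer any request with 200 "hello".
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"hello"})

async def main():
    # Drive the app directly with stub receive/send callables, the same
    # calling convention a server such as Hypercorn uses.
    sent = []

    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}

    async def send(message):
        sent.append(message)

    await app({"type": "http", "method": "GET", "path": "/"}, receive, send)
    return sent

messages = asyncio.run(main())
print(messages[0]["status"])  # 200
```

In real use the same `app` would be passed straight to `serve(app, config)` inside `asyncio.run`.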
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..030d755
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/context.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/context.cpython-39.pyc
new file mode 100644
index 0000000..170e439
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/context.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/lifespan.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/lifespan.cpython-39.pyc
new file mode 100644
index 0000000..e82b147
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/lifespan.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/run.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/run.cpython-39.pyc
new file mode 100644
index 0000000..113fb86
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/run.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/statsd.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/statsd.cpython-39.pyc
new file mode 100644
index 0000000..8dd8a30
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/statsd.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/task_group.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/task_group.cpython-39.pyc
new file mode 100644
index 0000000..445de22
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/task_group.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/tcp_server.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/tcp_server.cpython-39.pyc
new file mode 100644
index 0000000..0d61d26
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/tcp_server.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/udp_server.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/udp_server.cpython-39.pyc
new file mode 100644
index 0000000..e9b7aff
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/__pycache__/udp_server.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/context.py b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/context.py
new file mode 100644
index 0000000..edc8ad0
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/context.py
@@ -0,0 +1,74 @@
+import asyncio
+from typing import Any, Awaitable, Callable, Optional, Type, Union
+
+from .task_group import TaskGroup
+from ..config import Config
+from ..typing import (
+ ASGIFramework,
+ ASGIReceiveCallable,
+ ASGIReceiveEvent,
+ ASGISendEvent,
+ Event,
+ Scope,
+)
+from ..utils import invoke_asgi
+
+
+class EventWrapper:
+ def __init__(self) -> None:
+ self._event = asyncio.Event()
+
+ async def clear(self) -> None:
+ self._event.clear()
+
+ async def wait(self) -> None:
+ await self._event.wait()
+
+ async def set(self) -> None:
+ self._event.set()
+
+
+async def _handle(
+ app: ASGIFramework,
+ config: Config,
+ scope: Scope,
+ receive: ASGIReceiveCallable,
+ send: Callable[[Optional[ASGISendEvent]], Awaitable[None]],
+) -> None:
+ try:
+ await invoke_asgi(app, scope, receive, send)
+ except asyncio.CancelledError:
+ raise
+ except Exception:
+ await config.log.exception("Error in ASGI Framework")
+ finally:
+ await send(None)
+
+
+class Context:
+ event_class: Type[Event] = EventWrapper
+
+ def __init__(self, task_group: TaskGroup) -> None:
+ self.task_group = task_group
+
+ async def spawn_app(
+ self,
+ app: ASGIFramework,
+ config: Config,
+ scope: Scope,
+ send: Callable[[Optional[ASGISendEvent]], Awaitable[None]],
+ ) -> Callable[[ASGIReceiveEvent], Awaitable[None]]:
+ app_queue: asyncio.Queue[ASGIReceiveEvent] = asyncio.Queue(config.max_app_queue_size)
+ self.task_group.spawn(_handle(app, config, scope, app_queue.get, send))
+ return app_queue.put
+
+ def spawn(self, func: Callable, *args: Any) -> None:
+ self.task_group.spawn(func(*args))
+
+ @staticmethod
+ async def sleep(wait: Union[float, int]) -> None:
+ return await asyncio.sleep(wait)
+
+ @staticmethod
+ def time() -> float:
+ return asyncio.get_event_loop().time()
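`spawn_app` above bridges server and application with a bounded queue: the server keeps `app_queue.put` as its way of delivering events, while the application is handed `app_queue.get` as its ASGI receive callable. A stdlib-only sketch of that wiring (names illustrative):

```python
import asyncio

async def demo():
    # The server side holds `put`; the application side is handed `get`
    # as its receive callable, mirroring spawn_app above.
    app_queue: asyncio.Queue = asyncio.Queue(10)

    async def app(receive):
        event = await receive()
        return event["type"]

    app_task = asyncio.create_task(app(app_queue.get))
    await app_queue.put({"type": "http.request"})  # server delivers an event
    return await app_task

result = asyncio.run(demo())
print(result)  # http.request
```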
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/lifespan.py b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/lifespan.py
new file mode 100644
index 0000000..100b4e3
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/lifespan.py
@@ -0,0 +1,85 @@
+import asyncio
+
+from ..config import Config
+from ..typing import ASGIFramework, ASGIReceiveEvent, ASGISendEvent, LifespanScope
+from ..utils import invoke_asgi, LifespanFailure, LifespanTimeout
+
+
+class UnexpectedMessage(Exception):
+ pass
+
+
+class Lifespan:
+ def __init__(self, app: ASGIFramework, config: Config) -> None:
+ self.app = app
+ self.config = config
+ self.startup = asyncio.Event()
+ self.shutdown = asyncio.Event()
+ self.app_queue: asyncio.Queue = asyncio.Queue(config.max_app_queue_size)
+ self.supported = True
+
+ # This mimics the Trio nursery.start task_status and is
+ # required to ensure the support has been checked before
+ # waiting on timeouts.
+ self._started = asyncio.Event()
+
+ async def handle_lifespan(self) -> None:
+ self._started.set()
+ scope: LifespanScope = {"type": "lifespan", "asgi": {"spec_version": "2.0"}}
+ try:
+ await invoke_asgi(self.app, scope, self.asgi_receive, self.asgi_send)
+ except LifespanFailure:
+ # Lifespan failures should crash the server
+ raise
+ except Exception:
+ self.supported = False
+ if not self.startup.is_set():
+ message = "ASGI Framework Lifespan error, continuing without Lifespan support"
+ elif not self.shutdown.is_set():
+ message = "ASGI Framework Lifespan error, shutdown without Lifespan support"
+ else:
+ message = "ASGI Framework Lifespan errored after shutdown."
+
+ await self.config.log.exception(message)
+ finally:
+ self.startup.set()
+ self.shutdown.set()
+
+ async def wait_for_startup(self) -> None:
+ await self._started.wait()
+ if not self.supported:
+ return
+
+ await self.app_queue.put({"type": "lifespan.startup"})
+ try:
+ await asyncio.wait_for(self.startup.wait(), timeout=self.config.startup_timeout)
+ except asyncio.TimeoutError as error:
+ raise LifespanTimeout("startup") from error
+
+ async def wait_for_shutdown(self) -> None:
+ await self._started.wait()
+ if not self.supported:
+ return
+
+ await self.app_queue.put({"type": "lifespan.shutdown"})
+ try:
+ await asyncio.wait_for(self.shutdown.wait(), timeout=self.config.shutdown_timeout)
+ except asyncio.TimeoutError as error:
+ raise LifespanTimeout("shutdown") from error
+
+ async def asgi_receive(self) -> ASGIReceiveEvent:
+ return await self.app_queue.get()
+
+ async def asgi_send(self, message: ASGISendEvent) -> None:
+ if message["type"] == "lifespan.startup.complete":
+ self.startup.set()
+ elif message["type"] == "lifespan.shutdown.complete":
+ self.shutdown.set()
+ elif message["type"] == "lifespan.startup.failed":
+ self.startup.set()
+ raise LifespanFailure("startup", message["message"])
+ elif message["type"] == "lifespan.shutdown.failed":
+ self.shutdown.set()
+ raise LifespanFailure("shutdown", message["message"])
+ else:
+ raise UnexpectedMessage(message["type"])
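The `asgi_send` branches above implement the server half of the ASGI lifespan protocol. A stdlib-only sketch of the full message exchange, driving a toy lifespan app the way `wait_for_startup`/`wait_for_shutdown` do (the app and driver are illustrative, not Hypercorn code):

```python
import asyncio

async def lifespan_app(scope, receive, send):
    # A minimal app implementing the ASGI lifespan protocol.
    assert scope["type"] == "lifespan"
    while True:
        message = await receive()
        if message["type"] == "lifespan.startup":
            await send({"type": "lifespan.startup.complete"})
        elif message["type"] == "lifespan.shutdown":
            await send({"type": "lifespan.shutdown.complete"})
            return

async def drive():
    # Act as the server: feed startup/shutdown events and record the
    # replies, the same exchange Lifespan performs via its app_queue.
    inbox: asyncio.Queue = asyncio.Queue()
    replies = []

    async def send(message):
        replies.append(message["type"])

    scope = {"type": "lifespan", "asgi": {"spec_version": "2.0"}}
    task = asyncio.create_task(lifespan_app(scope, inbox.get, send))
    await inbox.put({"type": "lifespan.startup"})
    await inbox.put({"type": "lifespan.shutdown"})
    await task
    return replies

replies = asyncio.run(drive())
print(replies)
```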
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/run.py b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/run.py
new file mode 100644
index 0000000..8c4d8fc
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/run.py
@@ -0,0 +1,266 @@
+import asyncio
+import platform
+import signal
+import ssl
+from functools import partial
+from multiprocessing.synchronize import Event as EventType
+from os import getpid
+from socket import socket
+from typing import Any, Awaitable, Callable, Optional
+
+from .lifespan import Lifespan
+from .statsd import StatsdLogger
+from .tcp_server import TCPServer
+from .udp_server import UDPServer
+from ..config import Config, Sockets
+from ..typing import ASGIFramework
+from ..utils import (
+ check_multiprocess_shutdown_event,
+ load_application,
+ MustReloadException,
+ observe_changes,
+ raise_shutdown,
+ repr_socket_addr,
+ restart,
+ Shutdown,
+)
+
+try:
+ from socket import AF_UNIX
+except ImportError:
+ AF_UNIX = None
+
+
+async def _windows_signal_support() -> None:
+ # See https://bugs.python.org/issue23057, to catch signals on
+ # Windows it is necessary for an IO event to happen periodically.
+ while True:
+ await asyncio.sleep(1)
+
+
+def _share_socket(sock: socket) -> socket:
+ # Windows requires the socket be explicitly shared across
+ # multiple workers (processes).
+ from socket import fromshare # type: ignore
+
+ sock_data = sock.share(getpid()) # type: ignore
+ return fromshare(sock_data)
+
+
+async def worker_serve(
+ app: ASGIFramework,
+ config: Config,
+ *,
+ sockets: Optional[Sockets] = None,
+ shutdown_trigger: Optional[Callable[..., Awaitable[None]]] = None,
+) -> None:
+ config.set_statsd_logger_class(StatsdLogger)
+
+ lifespan = Lifespan(app, config)
+ lifespan_task = asyncio.ensure_future(lifespan.handle_lifespan())
+
+ await lifespan.wait_for_startup()
+ if lifespan_task.done():
+ exception = lifespan_task.exception()
+ if exception is not None:
+ raise exception
+
+ if sockets is None:
+ sockets = config.create_sockets()
+
+ loop = asyncio.get_event_loop()
+ tasks = []
+ if platform.system() == "Windows":
+ tasks.append(loop.create_task(_windows_signal_support()))
+
+ if shutdown_trigger is None:
+ signal_event = asyncio.Event()
+
+ def _signal_handler(*_: Any) -> None: # noqa: N803
+ signal_event.set()
+
+ for signal_name in {"SIGINT", "SIGTERM", "SIGBREAK"}:
+ if hasattr(signal, signal_name):
+ try:
+ loop.add_signal_handler(getattr(signal, signal_name), _signal_handler)
+ except NotImplementedError:
+                    # add_signal_handler may not be implemented on Windows
+ signal.signal(getattr(signal, signal_name), _signal_handler)
+
+ shutdown_trigger = signal_event.wait # type: ignore
+
+ tasks.append(loop.create_task(raise_shutdown(shutdown_trigger)))
+
+ if config.use_reloader:
+ tasks.append(loop.create_task(observe_changes(asyncio.sleep)))
+
+ ssl_handshake_timeout = None
+ if config.ssl_enabled:
+ ssl_context = config.create_ssl_context()
+ ssl_handshake_timeout = config.ssl_handshake_timeout
+
+ async def _server_callback(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
+ await TCPServer(app, loop, config, reader, writer)
+
+ servers = []
+ for sock in sockets.secure_sockets:
+ if config.workers > 1 and platform.system() == "Windows":
+ sock = _share_socket(sock)
+
+ servers.append(
+ await asyncio.start_server(
+ _server_callback,
+ backlog=config.backlog,
+ loop=loop,
+ ssl=ssl_context,
+ sock=sock,
+ ssl_handshake_timeout=ssl_handshake_timeout,
+ )
+ )
+ bind = repr_socket_addr(sock.family, sock.getsockname())
+ await config.log.info(f"Running on https://{bind} (CTRL + C to quit)")
+
+ for sock in sockets.insecure_sockets:
+ if config.workers > 1 and platform.system() == "Windows":
+ sock = _share_socket(sock)
+
+ servers.append(
+ await asyncio.start_server(
+ _server_callback, backlog=config.backlog, loop=loop, sock=sock
+ )
+ )
+ bind = repr_socket_addr(sock.family, sock.getsockname())
+ await config.log.info(f"Running on http://{bind} (CTRL + C to quit)")
+
+ tasks.extend(server.serve_forever() for server in servers) # type: ignore
+
+ for sock in sockets.quic_sockets:
+ if config.workers > 1 and platform.system() == "Windows":
+ sock = _share_socket(sock)
+
+ await loop.create_datagram_endpoint(lambda: UDPServer(app, loop, config), sock=sock)
+ bind = repr_socket_addr(sock.family, sock.getsockname())
+ await config.log.info(f"Running on https://{bind} (QUIC) (CTRL + C to quit)")
+
+ reload_ = False
+ try:
+ gathered_tasks = asyncio.gather(*tasks)
+ await gathered_tasks
+ except MustReloadException:
+ reload_ = True
+ except (Shutdown, KeyboardInterrupt):
+ pass
+ finally:
+ for server in servers:
+ server.close()
+ await server.wait_closed()
+
+ try:
+ await asyncio.sleep(config.graceful_timeout)
+ except (Shutdown, KeyboardInterrupt):
+ pass
+
+        # Retrieve the gathered tasks' cancellation exception, to
+        # prevent a warning that it hasn't been retrieved.
+ gathered_tasks.exception()
+
+ await lifespan.wait_for_shutdown()
+ lifespan_task.cancel()
+ await lifespan_task
+
+ if reload_:
+ restart()
+
+
+def asyncio_worker(
+ config: Config, sockets: Optional[Sockets] = None, shutdown_event: Optional[EventType] = None
+) -> None:
+ app = load_application(config.application_path)
+
+ shutdown_trigger = None
+ if shutdown_event is not None:
+ shutdown_trigger = partial(check_multiprocess_shutdown_event, shutdown_event, asyncio.sleep)
+
+ if config.workers > 1 and platform.system() == "Windows":
+ asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy()) # type: ignore
+
+ _run(
+ partial(worker_serve, app, config, sockets=sockets),
+ debug=config.debug,
+ shutdown_trigger=shutdown_trigger,
+ )
+
+
+def uvloop_worker(
+ config: Config, sockets: Optional[Sockets] = None, shutdown_event: Optional[EventType] = None
+) -> None:
+ try:
+ import uvloop
+ except ImportError as error:
+ raise Exception("uvloop is not installed") from error
+ else:
+ asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
+
+ app = load_application(config.application_path)
+
+ shutdown_trigger = None
+ if shutdown_event is not None:
+ shutdown_trigger = partial(check_multiprocess_shutdown_event, shutdown_event, asyncio.sleep)
+
+ _run(
+ partial(worker_serve, app, config, sockets=sockets),
+ debug=config.debug,
+ shutdown_trigger=shutdown_trigger,
+ )
+
+
+def _run(
+ main: Callable,
+ *,
+ debug: bool = False,
+ shutdown_trigger: Optional[Callable[..., Awaitable[None]]] = None,
+) -> None:
+ loop = asyncio.new_event_loop()
+ asyncio.set_event_loop(loop)
+ loop.set_debug(debug)
+ loop.set_exception_handler(_exception_handler)
+
+ try:
+ loop.run_until_complete(main(shutdown_trigger=shutdown_trigger))
+ except KeyboardInterrupt:
+ pass
+ finally:
+ try:
+ _cancel_all_tasks(loop)
+ loop.run_until_complete(loop.shutdown_asyncgens())
+ finally:
+ asyncio.set_event_loop(None)
+ loop.close()
+
+
+def _cancel_all_tasks(loop: asyncio.AbstractEventLoop) -> None:
+ tasks = [task for task in asyncio.all_tasks(loop) if not task.done()]
+ if not tasks:
+ return
+
+ for task in tasks:
+ task.cancel()
+ loop.run_until_complete(asyncio.gather(*tasks, loop=loop, return_exceptions=True))
+
+ for task in tasks:
+ if not task.cancelled() and task.exception() is not None:
+ loop.call_exception_handler(
+ {
+ "message": "unhandled exception during shutdown",
+ "exception": task.exception(),
+ "task": task,
+ }
+ )
+
+
+def _exception_handler(loop: asyncio.AbstractEventLoop, context: dict) -> None:
+ exception = context.get("exception")
+ if isinstance(exception, ssl.SSLError):
+ pass # Handshake failure
+ else:
+ loop.default_exception_handler(context)
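`_cancel_all_tasks` above mirrors the shutdown sequence of `asyncio.run`: cancel every pending task, then gather them so the cancellations are retrieved. A stdlib-only sketch of the same pattern (note the vendored code also passes `loop=` to `asyncio.gather`, which Python 3.9 still accepts but 3.10 removed; the sketch omits it):

```python
import asyncio

def cancel_all_tasks(loop):
    # Cancel every pending task on the loop, then run the loop until the
    # cancellations have been delivered and retrieved.
    tasks = [task for task in asyncio.all_tasks(loop) if not task.done()]
    if not tasks:
        return
    for task in tasks:
        task.cancel()
    loop.run_until_complete(asyncio.gather(*tasks, return_exceptions=True))

loop = asyncio.new_event_loop()
task = loop.create_task(asyncio.sleep(60))
loop.run_until_complete(asyncio.sleep(0))  # let the task start
cancel_all_tasks(loop)
print(task.cancelled())  # True
loop.close()
```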
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/statsd.py b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/statsd.py
new file mode 100644
index 0000000..cc50df2
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/statsd.py
@@ -0,0 +1,24 @@
+import asyncio
+from typing import Optional
+
+from ..config import Config
+from ..statsd import StatsdLogger as Base
+
+
+class _DummyProto(asyncio.DatagramProtocol):
+ pass
+
+
+class StatsdLogger(Base):
+ def __init__(self, config: Config) -> None:
+ super().__init__(config)
+ self.address = config.statsd_host.rsplit(":", 1)
+ self.transport: Optional[asyncio.BaseTransport] = None
+
+ async def _socket_send(self, message: bytes) -> None:
+ if self.transport is None:
+ self.transport, _ = await asyncio.get_event_loop().create_datagram_endpoint(
+ _DummyProto, remote_addr=(self.address[0], int(self.address[1]))
+ )
+
+ self.transport.sendto(message) # type: ignore
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/task_group.py b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/task_group.py
new file mode 100644
index 0000000..8feceab
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/task_group.py
@@ -0,0 +1,34 @@
+import asyncio
+import weakref
+from types import TracebackType
+from typing import Coroutine
+
+
+class TaskGroup:
+ def __init__(self, loop: asyncio.AbstractEventLoop) -> None:
+ self._loop = loop
+ self._tasks: weakref.WeakSet = weakref.WeakSet()
+
+ def spawn(self, coro: Coroutine) -> None:
+ self._tasks.add(self._loop.create_task(coro))
+
+ async def __aenter__(self) -> "TaskGroup":
+ return self
+
+ async def __aexit__(self, exc_type: type, exc_value: BaseException, tb: TracebackType) -> None:
+ if exc_type is not None:
+ self._cancel_tasks()
+
+ try:
+ task = asyncio.gather(*self._tasks)
+ await task
+ finally:
+ task.cancel()
+ try:
+ await task
+ except asyncio.CancelledError:
+ pass
+
+ def _cancel_tasks(self) -> None:
+ for task in self._tasks:
+ task.cancel()
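`TaskGroup` above is a pre-3.11 stand-in for what later became `asyncio.TaskGroup`: spawn tasks, and on exit cancel everything if an exception occurred, otherwise wait for all of them. A stdlib-only sketch of the same shape (class and function names illustrative):

```python
import asyncio

class MiniTaskGroup:
    # Same contract as the TaskGroup above: tasks spawned inside the
    # `async with` block are awaited (or cancelled on error) on exit.
    def __init__(self):
        self._tasks = []

    def spawn(self, coro):
        self._tasks.append(asyncio.create_task(coro))

    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc_value, tb):
        if exc_type is not None:
            for task in self._tasks:
                task.cancel()
        await asyncio.gather(*self._tasks, return_exceptions=True)

async def demo():
    results = []

    async def work(n):
        results.append(n)

    async with MiniTaskGroup() as group:
        group.spawn(work(1))
        group.spawn(work(2))
    return sorted(results)

result = asyncio.run(demo())
print(result)  # [1, 2]
```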
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/tcp_server.py b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/tcp_server.py
new file mode 100644
index 0000000..afd3d59
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/tcp_server.py
@@ -0,0 +1,153 @@
+import asyncio
+from typing import Any, Callable, cast, Generator, Optional
+
+from .context import Context
+from .task_group import TaskGroup
+from ..config import Config
+from ..events import Closed, Event, RawData, Updated
+from ..protocol import ProtocolWrapper
+from ..typing import ASGIFramework
+from ..utils import parse_socket_addr
+
+MAX_RECV = 2 ** 16
+
+
+class EventWrapper:
+ def __init__(self) -> None:
+ self._event = asyncio.Event()
+
+ async def clear(self) -> None:
+ self._event.clear()
+
+ async def wait(self) -> None:
+ await self._event.wait()
+
+ async def set(self) -> None:
+ self._event.set()
+
+
+class TCPServer:
+ def __init__(
+ self,
+ app: ASGIFramework,
+ loop: asyncio.AbstractEventLoop,
+ config: Config,
+ reader: asyncio.StreamReader,
+ writer: asyncio.StreamWriter,
+ ) -> None:
+ self.app = app
+ self.config = config
+ self.loop = loop
+ self.protocol: ProtocolWrapper
+ self.reader = reader
+ self.writer = writer
+ self.send_lock = asyncio.Lock()
+ self.timeout_lock = asyncio.Lock()
+
+ self._keep_alive_timeout_handle: Optional[asyncio.Task] = None
+
+ def __await__(self) -> Generator[Any, None, None]:
+ return self.run().__await__()
+
+ async def run(self) -> None:
+ socket = self.writer.get_extra_info("socket")
+ try:
+ client = parse_socket_addr(socket.family, socket.getpeername())
+ server = parse_socket_addr(socket.family, socket.getsockname())
+ ssl_object = self.writer.get_extra_info("ssl_object")
+ if ssl_object is not None:
+ ssl = True
+ alpn_protocol = ssl_object.selected_alpn_protocol()
+ else:
+ ssl = False
+ alpn_protocol = "http/1.1"
+
+ async with TaskGroup(self.loop) as task_group:
+ context = Context(task_group)
+ self.protocol = ProtocolWrapper(
+ self.app,
+ self.config,
+ cast(Any, context),
+ ssl,
+ client,
+ server,
+ self.protocol_send,
+ alpn_protocol,
+ )
+ await self.protocol.initiate()
+ await self._update_keep_alive_timeout()
+ await self._read_data()
+ except OSError:
+ pass
+ finally:
+ await self._close()
+
+ async def protocol_send(self, event: Event) -> None:
+ if isinstance(event, RawData):
+ async with self.send_lock:
+ try:
+ self.writer.write(event.data)
+ await self.writer.drain()
+ except ConnectionError:
+ await self.protocol.handle(Closed())
+ elif isinstance(event, Closed):
+ await self._close()
+ await self.protocol.handle(Closed())
+ elif isinstance(event, Updated):
+ pass # Triggers the keep alive timeout update
+ await self._update_keep_alive_timeout()
+
+ async def _read_data(self) -> None:
+ while True:
+ try:
+ data = await self.reader.read(MAX_RECV)
+ except (
+ ConnectionError,
+ OSError,
+ asyncio.TimeoutError,
+ TimeoutError,
+ ):
+ await self.protocol.handle(Closed())
+ break
+ else:
+ if data == b"":
+ await self._update_keep_alive_timeout()
+ break
+ await self.protocol.handle(RawData(data))
+ await self._update_keep_alive_timeout()
+
+ async def _close(self) -> None:
+ try:
+ self.writer.write_eof()
+ except (NotImplementedError, OSError, RuntimeError):
+ pass # Likely SSL connection
+
+ try:
+ self.writer.close()
+ await self.writer.wait_closed()
+ except (BrokenPipeError, ConnectionResetError):
+ pass # Already closed
+
+ async def _update_keep_alive_timeout(self) -> None:
+ async with self.timeout_lock:
+ if self._keep_alive_timeout_handle is not None:
+ self._keep_alive_timeout_handle.cancel()
+ try:
+ await self._keep_alive_timeout_handle
+ except asyncio.CancelledError:
+ pass
+
+ self._keep_alive_timeout_handle = None
+ if self.protocol.idle:
+ self._keep_alive_timeout_handle = self.loop.create_task(
+ _call_later(self.config.keep_alive_timeout, self._timeout)
+ )
+
+ async def _timeout(self) -> None:
+ await self.protocol.handle(Closed())
+ self.writer.close()
+
+
+async def _call_later(timeout: float, callback: Callable) -> None:
+ await asyncio.sleep(timeout)
+ await asyncio.shield(callback())
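`_update_keep_alive_timeout` and `_call_later` above implement keep-alive as a cancellable sleeping task: any connection activity cancels and reschedules the timer, and only a fully idle connection lets it fire. A stdlib-only sketch of that pattern (timings and names illustrative, `asyncio.shield` omitted for brevity):

```python
import asyncio

async def demo():
    fired = []

    async def call_later(timeout, callback):
        # Sleep, then invoke the timeout callback; cancelling this task
        # before the sleep elapses cancels the timeout.
        await asyncio.sleep(timeout)
        await callback()

    async def on_timeout():
        fired.append(True)

    handle = asyncio.create_task(call_later(0.05, on_timeout))

    # Activity arrives before the timeout: cancel and reschedule.
    await asyncio.sleep(0.01)
    handle.cancel()
    try:
        await handle
    except asyncio.CancelledError:
        pass
    handle = asyncio.create_task(call_later(0.05, on_timeout))

    await asyncio.sleep(0.1)  # idle long enough for the timer to fire
    return fired

result = asyncio.run(demo())
print(result)  # [True]
```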
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/asyncio/udp_server.py b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/udp_server.py
new file mode 100644
index 0000000..430e992
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/asyncio/udp_server.py
@@ -0,0 +1,55 @@
+import asyncio
+from typing import Any, cast, Optional, Tuple, TYPE_CHECKING
+
+from .context import Context
+from .task_group import TaskGroup
+from ..config import Config
+from ..events import Closed, Event, RawData
+from ..typing import ASGIFramework
+from ..utils import parse_socket_addr
+
+if TYPE_CHECKING:
+ # h3/Quic is an optional part of Hypercorn
+ from ..protocol.quic import QuicProtocol # noqa: F401
+
+
+class UDPServer(asyncio.DatagramProtocol):
+ def __init__(self, app: ASGIFramework, loop: asyncio.AbstractEventLoop, config: Config) -> None:
+ self.app = app
+ self.config = config
+ self.loop = loop
+ self.protocol: "QuicProtocol"
+ self.protocol_queue: asyncio.Queue = asyncio.Queue(10)
+ self.transport: Optional[asyncio.DatagramTransport] = None
+
+ self.loop.create_task(self._consume_events())
+
+ def connection_made(self, transport: asyncio.DatagramTransport) -> None: # type: ignore
+ # h3/Quic is an optional part of Hypercorn
+ from ..protocol.quic import QuicProtocol # noqa: F811
+
+ self.transport = transport
+ socket = self.transport.get_extra_info("socket")
+ server = parse_socket_addr(socket.family, socket.getsockname())
+ task_group = TaskGroup(self.loop)
+ context = Context(task_group)
+ self.protocol = QuicProtocol(
+ self.app, self.config, cast(Any, context), server, self.protocol_send
+ )
+
+ def datagram_received(self, data: bytes, address: Tuple[bytes, str]) -> None: # type: ignore
+ try:
+ self.protocol_queue.put_nowait(RawData(data=data, address=address)) # type: ignore
+ except asyncio.QueueFull:
+            pass  # Just throw the data away; this is UDP
+
+ async def protocol_send(self, event: Event) -> None:
+ if isinstance(event, RawData):
+ self.transport.sendto(event.data, event.address)
+
+ async def _consume_events(self) -> None:
+ while True:
+ event = await self.protocol_queue.get()
+ await self.protocol.handle(event)
+ if isinstance(event, Closed):
+ break
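`datagram_received` above deliberately drops packets when the bounded queue is full, which is acceptable under UDP's best-effort delivery. A stdlib-only sketch of that overflow behaviour:

```python
import asyncio

# A bounded queue that sheds load: put_nowait raises QueueFull instead
# of blocking, so excess datagrams are simply discarded (UDP semantics).
queue: asyncio.Queue = asyncio.Queue(2)
dropped = 0
for packet in (b"a", b"b", b"c"):
    try:
        queue.put_nowait(packet)
    except asyncio.QueueFull:
        dropped += 1

print(queue.qsize(), dropped)  # 2 1
```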
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/config.py b/.venv/lib/python3.9/site-packages/hypercorn/config.py
new file mode 100644
index 0000000..46c13c7
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/config.py
@@ -0,0 +1,373 @@
+import importlib
+import importlib.util
+import logging
+import os
+import socket
+import ssl
+import stat
+import types
+import warnings
+from dataclasses import dataclass
+from ssl import SSLContext, VerifyFlags, VerifyMode
+from time import time
+from typing import Any, AnyStr, Dict, List, Mapping, Optional, Tuple, Type, Union
+from wsgiref.handlers import format_date_time
+
+import toml
+
+from .logging import Logger
+
+BYTES = 1
+OCTETS = 1
+SECONDS = 1.0
+
+FilePath = Union[AnyStr, os.PathLike]
+SocketKind = Union[int, socket.SocketKind]
+
+
+@dataclass
+class Sockets:
+ secure_sockets: List[socket.socket]
+ insecure_sockets: List[socket.socket]
+ quic_sockets: List[socket.socket]
+
+
+class SocketTypeError(Exception):
+ def __init__(self, expected: SocketKind, actual: SocketKind) -> None:
+ super().__init__(
+ f'Unexpected socket type, wanted "{socket.SocketKind(expected)}" got '
+ f'"{socket.SocketKind(actual)}"'
+ )
+
+
+class Config:
+ _bind = ["127.0.0.1:8000"]
+ _insecure_bind: List[str] = []
+ _quic_bind: List[str] = []
+ _quic_addresses: List[Tuple] = []
+ _log: Optional[Logger] = None
+
+ access_log_format = '%(h)s %(l)s %(l)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"'
+ accesslog: Union[logging.Logger, str, None] = None
+ alpn_protocols = ["h2", "http/1.1"]
+ alt_svc_headers: List[str] = []
+ application_path: str
+ backlog = 100
+ ca_certs: Optional[str] = None
+ certfile: Optional[str] = None
+ ciphers: str = "ECDHE+AESGCM"
+ debug = False
+ dogstatsd_tags = ""
+ errorlog: Union[logging.Logger, str, None] = "-"
+ graceful_timeout: float = 3 * SECONDS
+ group: Optional[int] = None
+ h11_max_incomplete_size = 16 * 1024 * BYTES
+ h2_max_concurrent_streams = 100
+ h2_max_header_list_size = 2 ** 16
+ h2_max_inbound_frame_size = 2 ** 14 * OCTETS
+ include_server_header = True
+ keep_alive_timeout = 5 * SECONDS
+ keyfile: Optional[str] = None
+ logconfig: Optional[str] = None
+ logconfig_dict: Optional[dict] = None
+ logger_class = Logger
+ loglevel: str = "INFO"
+ max_app_queue_size: int = 10
+ pid_path: Optional[str] = None
+ root_path = ""
+ server_names: List[str] = []
+ shutdown_timeout = 60 * SECONDS
+ ssl_handshake_timeout = 60 * SECONDS
+ startup_timeout = 60 * SECONDS
+ statsd_host: Optional[str] = None
+ statsd_prefix = ""
+ umask: Optional[int] = None
+ use_reloader = False
+ user: Optional[int] = None
+ verify_flags: Optional[VerifyFlags] = None
+ verify_mode: Optional[VerifyMode] = None
+ websocket_max_message_size = 16 * 1024 * 1024 * BYTES
+ websocket_ping_interval: Optional[int] = None
+ worker_class = "asyncio"
+ workers = 1
+
+ def set_cert_reqs(self, value: int) -> None:
+ warnings.warn("Please use verify_mode instead", Warning)
+ self.verify_mode = VerifyMode(value)
+
+ cert_reqs = property(None, set_cert_reqs)
+
+ @property
+ def log(self) -> Logger:
+ if self._log is None:
+ self._log = self.logger_class(self)
+ return self._log
+
+ @property
+ def bind(self) -> List[str]:
+ return self._bind
+
+ @bind.setter
+ def bind(self, value: Union[List[str], str]) -> None:
+ if isinstance(value, str):
+ self._bind = [value]
+ else:
+ self._bind = value
+
+ @property
+ def insecure_bind(self) -> List[str]:
+ return self._insecure_bind
+
+ @insecure_bind.setter
+ def insecure_bind(self, value: Union[List[str], str]) -> None:
+ if isinstance(value, str):
+ self._insecure_bind = [value]
+ else:
+ self._insecure_bind = value
+
+ @property
+ def quic_bind(self) -> List[str]:
+ return self._quic_bind
+
+ @quic_bind.setter
+ def quic_bind(self, value: Union[List[str], str]) -> None:
+ if isinstance(value, str):
+ self._quic_bind = [value]
+ else:
+ self._quic_bind = value
+
+ def create_ssl_context(self) -> Optional[SSLContext]:
+ if not self.ssl_enabled:
+ return None
+
+ context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
+ context.set_ciphers(self.ciphers)
+ cipher_opts = 0
+ for attr in ["OP_NO_SSLv2", "OP_NO_SSLv3", "OP_NO_TLSv1", "OP_NO_TLSv1_1"]:
+ if hasattr(ssl, attr): # To be future proof
+ cipher_opts |= getattr(ssl, attr)
+ context.options |= cipher_opts # RFC 7540 Section 9.2: MUST be TLS >=1.2
+ context.options |= ssl.OP_NO_COMPRESSION # RFC 7540 Section 9.2.1: MUST disable compression
+ context.set_alpn_protocols(self.alpn_protocols)
+
+ if self.certfile is not None and self.keyfile is not None:
+ context.load_cert_chain(certfile=self.certfile, keyfile=self.keyfile)
+
+ if self.ca_certs is not None:
+ context.load_verify_locations(self.ca_certs)
+ if self.verify_mode is not None:
+ context.verify_mode = self.verify_mode
+ if self.verify_flags is not None:
+ context.verify_flags = self.verify_flags
+
+ return context
+
+ @property
+ def ssl_enabled(self) -> bool:
+ return self.certfile is not None and self.keyfile is not None
+
+ def create_sockets(self) -> Sockets:
+ if self.ssl_enabled:
+ secure_sockets = self._create_sockets(self.bind)
+ insecure_sockets = self._create_sockets(self.insecure_bind)
+ quic_sockets = self._create_sockets(self.quic_bind, socket.SOCK_DGRAM)
+ self._set_quic_addresses(quic_sockets)
+ else:
+ secure_sockets = []
+ insecure_sockets = self._create_sockets(self.bind)
+ quic_sockets = []
+ return Sockets(secure_sockets, insecure_sockets, quic_sockets)
+
+ def _set_quic_addresses(self, sockets: List[socket.socket]) -> None:
+ self._quic_addresses = []
+ for sock in sockets:
+ name = sock.getsockname()
+ if type(name) is not str and len(name) >= 2:
+ self._quic_addresses.append(name)
+ else:
+ warnings.warn(
+                    f'Cannot create an alt-svc header for the QUIC socket with address "{name}"',
+ Warning,
+ )
+
+ def _create_sockets(
+ self, binds: List[str], type_: int = socket.SOCK_STREAM
+ ) -> List[socket.socket]:
+ sockets: List[socket.socket] = []
+ for bind in binds:
+ binding: Any = None
+ if bind.startswith("unix:"):
+ sock = socket.socket(socket.AF_UNIX, type_)
+ binding = bind[5:]
+ try:
+ if stat.S_ISSOCK(os.stat(binding).st_mode):
+ os.remove(binding)
+ except FileNotFoundError:
+ pass
+ elif bind.startswith("fd://"):
+ sock = socket.socket(fileno=int(bind[5:]))
+ actual_type = sock.getsockopt(socket.SOL_SOCKET, socket.SO_TYPE)
+ if actual_type != type_:
+ raise SocketTypeError(type_, actual_type)
+ else:
+ bind = bind.replace("[", "").replace("]", "")
+ try:
+ value = bind.rsplit(":", 1)
+ host, port = value[0], int(value[1])
+ except (ValueError, IndexError):
+ host, port = bind, 8000
+ sock = socket.socket(socket.AF_INET6 if ":" in host else socket.AF_INET, type_)
+ if self.workers > 1:
+ try:
+ sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
+ except AttributeError:
+ pass
+ binding = (host, port)
+
+ sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
+
+ if bind.startswith("unix:"):
+ if self.umask is not None:
+ current_umask = os.umask(self.umask)
+ sock.bind(binding)
+ if self.user is not None and self.group is not None:
+ os.chown(binding, self.user, self.group)
+ if self.umask is not None:
+ os.umask(current_umask)
+ elif bind.startswith("fd://"):
+ pass
+ else:
+ sock.bind(binding)
+
+ sock.setblocking(False)
+ try:
+ sock.set_inheritable(True)
+ except AttributeError:
+ pass
+ sockets.append(sock)
+ return sockets
+
+ def response_headers(self, protocol: str) -> List[Tuple[bytes, bytes]]:
+ headers = [(b"date", format_date_time(time()).encode("ascii"))]
+ if self.include_server_header:
+ headers.append((b"server", f"hypercorn-{protocol}".encode("ascii")))
+
+ for alt_svc_header in self.alt_svc_headers:
+ headers.append((b"alt-svc", alt_svc_header.encode()))
+ if len(self.alt_svc_headers) == 0 and self._quic_addresses:
+ from aioquic.h3.connection import H3_ALPN
+
+ for version in H3_ALPN:
+ for addr in self._quic_addresses:
+ port = addr[1]
+ headers.append((b"alt-svc", b'%s=":%d"; ma=3600' % (version.encode(), port)))
+
+ return headers
+
+ def set_statsd_logger_class(self, statsd_logger: Type[Logger]) -> None:
+ if self.logger_class == Logger and self.statsd_host is not None:
+ self.logger_class = statsd_logger
+
+ @classmethod
+ def from_mapping(
+ cls: Type["Config"], mapping: Optional[Mapping[str, Any]] = None, **kwargs: Any
+ ) -> "Config":
+ """Create a configuration from a mapping.
+
+        This allows the configuration to be passed either as a mapping
+        or as keyword arguments, for example,
+
+ .. code-block:: python
+
+ config = {'keep_alive_timeout': 10}
+ Config.from_mapping(config)
+ Config.from_mapping(keep_alive_timeout=10)
+
+ Arguments:
+ mapping: Optionally a mapping object.
+ kwargs: Optionally a collection of keyword arguments to
+ form a mapping.
+ """
+ mappings: Dict[str, Any] = {}
+ if mapping is not None:
+ mappings.update(mapping)
+ mappings.update(kwargs)
+ config = cls()
+ for key, value in mappings.items():
+ try:
+ setattr(config, key, value)
+ except AttributeError:
+ pass
+
+ return config
+
+ @classmethod
+ def from_pyfile(cls: Type["Config"], filename: FilePath) -> "Config":
+ """Create a configuration from a Python file.
+
+ .. code-block:: python
+
+ Config.from_pyfile('hypercorn_config.py')
+
+ Arguments:
+ filename: The filename which gives the path to the file.
+ """
+ file_path = os.fspath(filename)
+ spec = importlib.util.spec_from_file_location("module.name", file_path)
+ module = importlib.util.module_from_spec(spec)
+ spec.loader.exec_module(module) # type: ignore
+ return cls.from_object(module)
+
+ @classmethod
+ def from_toml(cls: Type["Config"], filename: FilePath) -> "Config":
+ """Load the configuration values from a TOML formatted file.
+
+        This allows configuration to be loaded like so
+
+ .. code-block:: python
+
+ Config.from_toml('config.toml')
+
+ Arguments:
+ filename: The filename which gives the path to the file.
+ """
+ file_path = os.fspath(filename)
+ with open(file_path) as file_:
+ data = toml.load(file_)
+ return cls.from_mapping(data)
+
+ @classmethod
+ def from_object(cls: Type["Config"], instance: Union[object, str]) -> "Config":
+ """Create a configuration from a Python object.
+
+        This can be used to reference modules or objects within
+        modules, for example,
+
+ .. code-block:: python
+
+ Config.from_object('module')
+ Config.from_object('module.instance')
+ from module import instance
+ Config.from_object(instance)
+
+ are valid.
+
+ Arguments:
+ instance: Either a str referencing a python object or the
+ object itself.
+
+ """
+ if isinstance(instance, str):
+ try:
+ instance = importlib.import_module(instance)
+ except ImportError:
+ path, config = instance.rsplit(".", 1) # type: ignore
+ module = importlib.import_module(path)
+ instance = getattr(module, config)
+
+ mapping = {
+ key: getattr(instance, key)
+ for key in dir(instance)
+ if not isinstance(getattr(instance, key), types.ModuleType)
+ }
+ return cls.from_mapping(mapping)
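The `from_mapping` pattern above silently skips keys whose assignment raises `AttributeError` (for example, read-only properties such as `ssl_enabled`). A minimal standalone sketch of that behaviour, using a hypothetical `MiniConfig` class rather than hypercorn's real `Config`:

```python
# Standalone sketch of the from_mapping pattern (MiniConfig is hypothetical,
# not hypercorn's Config): kwargs win over the mapping, and assignments to
# read-only attributes are silently ignored.
class MiniConfig:
    keep_alive_timeout = 5
    workers = 1

    @property
    def ssl_enabled(self):  # read-only: assigning to it raises AttributeError
        return False

    @classmethod
    def from_mapping(cls, mapping=None, **kwargs):
        merged = {}
        if mapping is not None:
            merged.update(mapping)
        merged.update(kwargs)  # keyword arguments override the mapping
        config = cls()
        for key, value in merged.items():
            try:
                setattr(config, key, value)
            except AttributeError:
                pass  # read-only attributes are skipped, as in the diff above
        return config

config = MiniConfig.from_mapping(
    {"keep_alive_timeout": 10}, workers=4, ssl_enabled=True
)
print(config.keep_alive_timeout, config.workers, config.ssl_enabled)  # 10 4 False
```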
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/events.py b/.venv/lib/python3.9/site-packages/hypercorn/events.py
new file mode 100644
index 0000000..2e53513
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/events.py
@@ -0,0 +1,25 @@
+from abc import ABC
+from dataclasses import dataclass
+from typing import Optional, Tuple
+
+
+class Event(ABC):
+ pass
+
+
+@dataclass(frozen=True)
+class RawData(Event):
+ data: bytes
+ address: Optional[Tuple[str, int]] = None
+
+
+@dataclass(frozen=True)
+class Closed(Event):
+ pass
+
+
+@dataclass(frozen=True)
+class Updated(Event):
+ # Indicate that the protocol has updated (although it has nothing
+ # for the server to do).
+ pass
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/logging.py b/.venv/lib/python3.9/site-packages/hypercorn/logging.py
new file mode 100644
index 0000000..d505e72
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/logging.py
@@ -0,0 +1,182 @@
+import logging
+import os
+import sys
+import time
+from http import HTTPStatus
+from logging.config import dictConfig, fileConfig
+from typing import Any, IO, Mapping, Optional, TYPE_CHECKING, Union
+
+if TYPE_CHECKING:
+ from .config import Config
+ from .typing import ResponseSummary, WWWScope
+
+
+def _create_logger(
+ name: str,
+ target: Union[logging.Logger, str, None],
+ level: Optional[str],
+ sys_default: IO,
+ *,
+ propagate: bool = True,
+) -> Optional[logging.Logger]:
+ if isinstance(target, logging.Logger):
+ return target
+
+ if target:
+ logger = logging.getLogger(name)
+ logger.handlers = [
+ logging.StreamHandler(sys_default) if target == "-" else logging.FileHandler(target)
+ ]
+ logger.propagate = propagate
+ formatter = logging.Formatter(
+ "%(asctime)s [%(process)d] [%(levelname)s] %(message)s",
+ "[%Y-%m-%d %H:%M:%S %z]",
+ )
+ logger.handlers[0].setFormatter(formatter)
+ if level is not None:
+ logger.setLevel(logging.getLevelName(level.upper()))
+ return logger
+ else:
+ return None
+
+
+class Logger:
+ def __init__(self, config: "Config") -> None:
+ self.access_log_format = config.access_log_format
+
+ self.access_logger = _create_logger(
+ "hypercorn.access",
+ config.accesslog,
+ config.loglevel,
+ sys.stdout,
+ propagate=False,
+ )
+ self.error_logger = _create_logger(
+ "hypercorn.error", config.errorlog, config.loglevel, sys.stderr
+ )
+
+ if config.logconfig is not None:
+ log_config = {
+ "__file__": config.logconfig,
+ "here": os.path.dirname(config.logconfig),
+ }
+ fileConfig(config.logconfig, defaults=log_config, disable_existing_loggers=False)
+ else:
+ if config.logconfig_dict is not None:
+ dictConfig(config.logconfig_dict)
+
+ async def access(
+ self, request: "WWWScope", response: "ResponseSummary", request_time: float
+ ) -> None:
+ if self.access_logger is not None:
+ self.access_logger.info(
+ self.access_log_format, self.atoms(request, response, request_time)
+ )
+
+ async def critical(self, message: str, *args: Any, **kwargs: Any) -> None:
+ if self.error_logger is not None:
+ self.error_logger.critical(message, *args, **kwargs)
+
+ async def error(self, message: str, *args: Any, **kwargs: Any) -> None:
+ if self.error_logger is not None:
+ self.error_logger.error(message, *args, **kwargs)
+
+ async def warning(self, message: str, *args: Any, **kwargs: Any) -> None:
+ if self.error_logger is not None:
+ self.error_logger.warning(message, *args, **kwargs)
+
+ async def info(self, message: str, *args: Any, **kwargs: Any) -> None:
+ if self.error_logger is not None:
+ self.error_logger.info(message, *args, **kwargs)
+
+ async def debug(self, message: str, *args: Any, **kwargs: Any) -> None:
+ if self.error_logger is not None:
+ self.error_logger.debug(message, *args, **kwargs)
+
+ async def exception(self, message: str, *args: Any, **kwargs: Any) -> None:
+ if self.error_logger is not None:
+ self.error_logger.exception(message, *args, **kwargs)
+
+ async def log(self, level: int, message: str, *args: Any, **kwargs: Any) -> None:
+ if self.error_logger is not None:
+ self.error_logger.log(level, message, *args, **kwargs)
+
+ def atoms(
+ self, request: "WWWScope", response: "ResponseSummary", request_time: float
+ ) -> Mapping[str, str]:
+ """Create and return an access log atoms dictionary.
+
+        This can be overridden and customised if desired. It should
+ return a mapping between an access log format key and a value.
+ """
+ return AccessLogAtoms(request, response, request_time)
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self.error_logger, name)
+
+
+class AccessLogAtoms(dict):
+ def __init__(
+ self, request: "WWWScope", response: "ResponseSummary", request_time: float
+ ) -> None:
+ for name, value in request["headers"]:
+ self[f"{{{name.decode('latin1').lower()}}}i"] = value.decode("latin1")
+ for name, value in response.get("headers", []):
+ self[f"{{{name.decode('latin1').lower()}}}o"] = value.decode("latin1")
+ for name, value in os.environ.items():
+ self[f"{{{name.lower()}}}e"] = value
+ protocol = request.get("http_version", "ws")
+ client = request.get("client")
+ if client is None:
+ remote_addr = None
+ elif len(client) == 2:
+ remote_addr = f"{client[0]}:{client[1]}"
+ elif len(client) == 1:
+ remote_addr = client[0]
+ else: # make sure not to throw UnboundLocalError
+ remote_addr = f"??{client}???>"
+ if request["type"] == "http":
+ method = request["method"]
+ else:
+ method = "GET"
+ query_string = request["query_string"].decode()
+ path_with_qs = request["path"] + ("?" + query_string if query_string else "")
+ status_code = response["status"]
+ try:
+ status_phrase = HTTPStatus(status_code).phrase
+ except ValueError:
+ status_phrase = f"??{status_code}???>"
+ self.update(
+ {
+ "h": remote_addr,
+ "l": "-",
+ "t": time.strftime("[%d/%b/%Y:%H:%M:%S %z]"),
+ "r": f"{method} {request['path']} {protocol}",
+ "R": f"{method} {path_with_qs} {protocol}",
+ "s": response["status"],
+ "st": status_phrase,
+ "S": request["scheme"],
+ "m": method,
+ "U": request["path"],
+ "Uq": path_with_qs,
+ "q": query_string,
+ "H": protocol,
+ "b": self["{Content-Length}o"],
+ "B": self["{Content-Length}o"],
+ "f": self["{Referer}i"],
+ "a": self["{User-Agent}i"],
+ "T": int(request_time),
+ "D": int(request_time * 1_000_000),
+ "L": f"{request_time:.6f}",
+ "p": f"<{os.getpid()}>",
+ }
+ )
+
+ def __getitem__(self, key: str) -> str:
+ try:
+ if key.startswith("{"):
+ return super().__getitem__(key.lower())
+ else:
+ return super().__getitem__(key)
+ except KeyError:
+ return "-"
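The `AccessLogAtoms.__getitem__` above gives access-log formats a forgiving lookup: `{...}` keys are lower-cased before lookup (header names are case-insensitive) and any missing atom resolves to `"-"`. A standalone sketch of just that lookup rule, using a hypothetical `Atoms` dict subclass:

```python
# Standalone sketch of the AccessLogAtoms lookup rule (Atoms is a
# hypothetical stand-in): "{...}" keys are lower-cased, missing keys
# fall back to "-" instead of raising KeyError.
class Atoms(dict):
    def __getitem__(self, key):
        try:
            if key.startswith("{"):
                return super().__getitem__(key.lower())
            return super().__getitem__(key)
        except KeyError:
            return "-"

atoms = Atoms({"{user-agent}i": "curl/7.79", "s": 200})
print(atoms["{User-Agent}i"])  # case-insensitive header lookup -> curl/7.79
print(atoms["f"])              # absent Referer atom -> "-"
```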
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/middleware/__init__.py b/.venv/lib/python3.9/site-packages/hypercorn/middleware/__init__.py
new file mode 100644
index 0000000..57f5a18
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/middleware/__init__.py
@@ -0,0 +1,10 @@
+from .dispatcher import DispatcherMiddleware
+from .http_to_https import HTTPToHTTPSRedirectMiddleware
+from .wsgi import AsyncioWSGIMiddleware, TrioWSGIMiddleware
+
+__all__ = (
+ "AsyncioWSGIMiddleware",
+ "DispatcherMiddleware",
+ "HTTPToHTTPSRedirectMiddleware",
+ "TrioWSGIMiddleware",
+)
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/middleware/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/middleware/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..6db9a81
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/middleware/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/middleware/__pycache__/dispatcher.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/middleware/__pycache__/dispatcher.cpython-39.pyc
new file mode 100644
index 0000000..e0a547b
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/middleware/__pycache__/dispatcher.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/middleware/__pycache__/http_to_https.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/middleware/__pycache__/http_to_https.cpython-39.pyc
new file mode 100644
index 0000000..1ad80c0
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/middleware/__pycache__/http_to_https.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/middleware/__pycache__/wsgi.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/middleware/__pycache__/wsgi.cpython-39.pyc
new file mode 100644
index 0000000..6072c1b
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/middleware/__pycache__/wsgi.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/middleware/dispatcher.py b/.venv/lib/python3.9/site-packages/hypercorn/middleware/dispatcher.py
new file mode 100644
index 0000000..602abf3
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/middleware/dispatcher.py
@@ -0,0 +1,107 @@
+import asyncio
+from functools import partial
+from typing import Callable, Dict
+
+from ..asyncio.task_group import TaskGroup
+from ..typing import ASGIFramework, Scope
+from ..utils import invoke_asgi
+
+MAX_QUEUE_SIZE = 10
+
+
+class _DispatcherMiddleware:
+ def __init__(self, mounts: Dict[str, ASGIFramework]) -> None:
+ self.mounts = mounts
+
+ async def __call__(self, scope: Scope, receive: Callable, send: Callable) -> None:
+ if scope["type"] == "lifespan":
+ await self._handle_lifespan(scope, receive, send)
+ else:
+ for path, app in self.mounts.items():
+ if scope["path"].startswith(path):
+ scope["path"] = scope["path"][len(path) :] or "/" # type: ignore
+ return await invoke_asgi(app, scope, receive, send)
+ await send(
+ {
+ "type": "http.response.start",
+ "status": 404,
+ "headers": [(b"content-length", b"0")],
+ }
+ )
+ await send({"type": "http.response.body"})
+
+ async def _handle_lifespan(self, scope: Scope, receive: Callable, send: Callable) -> None:
+ pass
+
+
+class AsyncioDispatcherMiddleware(_DispatcherMiddleware):
+ async def _handle_lifespan(self, scope: Scope, receive: Callable, send: Callable) -> None:
+ self.app_queues: Dict[str, asyncio.Queue] = {
+ path: asyncio.Queue(MAX_QUEUE_SIZE) for path in self.mounts
+ }
+ self.startup_complete = {path: False for path in self.mounts}
+ self.shutdown_complete = {path: False for path in self.mounts}
+
+ async with TaskGroup(asyncio.get_event_loop()) as task_group:
+ for path, app in self.mounts.items():
+ task_group.spawn(
+ invoke_asgi(
+ app, scope, self.app_queues[path].get, partial(self.send, path, send)
+ )
+ )
+
+ while True:
+ message = await receive()
+ for queue in self.app_queues.values():
+ await queue.put(message)
+ if message["type"] == "lifespan.shutdown":
+ break
+
+ async def send(self, path: str, send: Callable, message: dict) -> None:
+ if message["type"] == "lifespan.startup.complete":
+ self.startup_complete[path] = True
+ if all(self.startup_complete.values()):
+ await send({"type": "lifespan.startup.complete"})
+ elif message["type"] == "lifespan.shutdown.complete":
+ self.shutdown_complete[path] = True
+ if all(self.shutdown_complete.values()):
+ await send({"type": "lifespan.shutdown.complete"})
+
+
+class TrioDispatcherMiddleware(_DispatcherMiddleware):
+ async def _handle_lifespan(self, scope: Scope, receive: Callable, send: Callable) -> None:
+ import trio
+
+ self.app_queues = {path: trio.open_memory_channel(MAX_QUEUE_SIZE) for path in self.mounts}
+ self.startup_complete = {path: False for path in self.mounts}
+ self.shutdown_complete = {path: False for path in self.mounts}
+
+ async with trio.open_nursery() as nursery:
+ for path, app in self.mounts.items():
+ nursery.start_soon(
+ invoke_asgi,
+ app,
+ scope,
+ self.app_queues[path][1].receive,
+ partial(self.send, path, send),
+ )
+
+ while True:
+ message = await receive()
+ for channels in self.app_queues.values():
+ await channels[0].send(message)
+ if message["type"] == "lifespan.shutdown":
+ break
+
+ async def send(self, path: str, send: Callable, message: dict) -> None:
+ if message["type"] == "lifespan.startup.complete":
+ self.startup_complete[path] = True
+ if all(self.startup_complete.values()):
+ await send({"type": "lifespan.startup.complete"})
+ elif message["type"] == "lifespan.shutdown.complete":
+ self.shutdown_complete[path] = True
+ if all(self.shutdown_complete.values()):
+ await send({"type": "lifespan.shutdown.complete"})
+
+
+DispatcherMiddleware = AsyncioDispatcherMiddleware # Remove with version 0.11
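The mount matching in `_DispatcherMiddleware.__call__` above picks the first mount whose path prefixes the request path (dict insertion order decides ties), strips the prefix, and falls back to `"/"` for an exact match; with no match, a 404 is sent. A standalone sketch of that matching step, with a hypothetical `match_mount` helper and string placeholders for the apps:

```python
# Standalone sketch of the mount-matching rule in _DispatcherMiddleware
# (match_mount is hypothetical): first matching prefix wins, the prefix
# is stripped, and an exact match yields "/".
def match_mount(mounts, path):
    for prefix, app in mounts.items():
        if path.startswith(prefix):
            return app, path[len(prefix):] or "/"
    return None, path  # no mount matched -> the middleware responds 404

mounts = {"/api": "api_app", "/admin": "admin_app"}
print(match_mount(mounts, "/api/users"))  # ('api_app', '/users')
print(match_mount(mounts, "/api"))        # ('api_app', '/')
print(match_mount(mounts, "/other"))      # (None, '/other')
```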
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/middleware/http_to_https.py b/.venv/lib/python3.9/site-packages/hypercorn/middleware/http_to_https.py
new file mode 100644
index 0000000..ec1af2d
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/middleware/http_to_https.py
@@ -0,0 +1,66 @@
+from typing import Callable, Optional
+from urllib.parse import urlunsplit
+
+from ..typing import ASGIFramework, HTTPScope, Scope, WebsocketScope, WWWScope
+from ..utils import invoke_asgi
+
+
+class HTTPToHTTPSRedirectMiddleware:
+ def __init__(self, app: ASGIFramework, host: Optional[str]) -> None:
+ self.app = app
+ self.host = host
+
+ async def __call__(self, scope: Scope, receive: Callable, send: Callable) -> None:
+ if scope["type"] == "http" and scope["scheme"] == "http":
+ await self._send_http_redirect(scope, send)
+ elif scope["type"] == "websocket" and scope["scheme"] == "ws":
+ # If the server supports the WebSocket Denial Response
+ # extension we can send a redirection response, if not we
+ # can only deny the WebSocket connection.
+ if "websocket.http.response" in scope.get("extensions", {}):
+ await self._send_websocket_redirect(scope, send)
+ else:
+ await send({"type": "websocket.close"})
+ else:
+ return await invoke_asgi(self.app, scope, receive, send)
+
+ async def _send_http_redirect(self, scope: HTTPScope, send: Callable) -> None:
+ new_url = self._new_url("https", scope)
+ await send(
+ {
+ "type": "http.response.start",
+ "status": 307,
+ "headers": [(b"location", new_url.encode())],
+ }
+ )
+ await send({"type": "http.response.body"})
+
+ async def _send_websocket_redirect(self, scope: WebsocketScope, send: Callable) -> None:
+ # If the HTTP version is 2 we should redirect with a https
+ # scheme not wss.
+ scheme = "wss"
+ if scope.get("http_version", "1.1") == "2":
+ scheme = "https"
+
+ new_url = self._new_url(scheme, scope)
+ await send(
+ {
+ "type": "websocket.http.response.start",
+ "status": 307,
+ "headers": [(b"location", new_url.encode())],
+ }
+ )
+ await send({"type": "websocket.http.response.body"})
+
+ def _new_url(self, scheme: str, scope: WWWScope) -> str:
+ host = self.host
+ if host is None:
+ for key, value in scope["headers"]:
+ if key == b"host":
+ host = value.decode("latin-1")
+ break
+ if host is None:
+ raise ValueError("Host to redirect to cannot be determined")
+
+ path = scope.get("root_path", "") + scope["raw_path"].decode()
+ return urlunsplit((scheme, host, path, scope["query_string"].decode(), ""))
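The `_new_url` helper above rebuilds the request URL with the secure scheme from the scope's host, `root_path` + raw path, and query string. A standalone sketch of the same construction via `urllib.parse.urlunsplit`, with a hypothetical `new_url` function taking the scope fields as plain arguments:

```python
from urllib.parse import urlunsplit

# Standalone sketch of _new_url above (new_url is hypothetical): rebuild
# the URL with the redirect scheme, keeping host, path, and query string.
def new_url(scheme, host, root_path, raw_path, query_string):
    path = root_path + raw_path.decode()
    return urlunsplit((scheme, host, path, query_string.decode(), ""))

print(new_url("https", "example.com", "", b"/login", b"next=%2Fhome"))
# https://example.com/login?next=%2Fhome
```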
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/middleware/wsgi.py b/.venv/lib/python3.9/site-packages/hypercorn/middleware/wsgi.py
new file mode 100644
index 0000000..b8ebe8e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/middleware/wsgi.py
@@ -0,0 +1,132 @@
+import asyncio
+from functools import partial
+from io import BytesIO
+from typing import Callable, Iterable, List, Optional, Tuple
+
+from ..typing import HTTPScope, Scope
+
+MAX_BODY_SIZE = 2 ** 16
+
+WSGICallable = Callable[[dict, Callable], Iterable[bytes]]
+
+
+class _WSGIMiddleware:
+ def __init__(self, wsgi_app: WSGICallable, max_body_size: int = MAX_BODY_SIZE) -> None:
+ self.wsgi_app = wsgi_app
+ self.max_body_size = max_body_size
+
+ async def __call__(self, scope: Scope, receive: Callable, send: Callable) -> None:
+ if scope["type"] == "http":
+ status_code, headers, body = await self._handle_http(scope, receive, send)
+ await send({"type": "http.response.start", "status": status_code, "headers": headers})
+ await send({"type": "http.response.body", "body": body})
+ elif scope["type"] == "websocket":
+ await send({"type": "websocket.close"})
+ elif scope["type"] == "lifespan":
+ return
+ else:
+ raise Exception(f"Unknown scope type, {scope['type']}")
+
+ async def _handle_http(
+ self, scope: HTTPScope, receive: Callable, send: Callable
+ ) -> Tuple[int, list, bytes]:
+ pass
+
+
+class AsyncioWSGIMiddleware(_WSGIMiddleware):
+ async def _handle_http(
+ self, scope: HTTPScope, receive: Callable, send: Callable
+ ) -> Tuple[int, list, bytes]:
+ loop = asyncio.get_event_loop()
+ instance = _WSGIInstance(self.wsgi_app, self.max_body_size)
+ return await instance.handle_http(scope, receive, partial(loop.run_in_executor, None))
+
+
+class TrioWSGIMiddleware(_WSGIMiddleware):
+ async def _handle_http(
+ self, scope: HTTPScope, receive: Callable, send: Callable
+ ) -> Tuple[int, list, bytes]:
+ import trio
+
+ instance = _WSGIInstance(self.wsgi_app, self.max_body_size)
+ return await instance.handle_http(scope, receive, trio.to_thread.run_sync)
+
+
+class _WSGIInstance:
+ def __init__(self, wsgi_app: WSGICallable, max_body_size: int = MAX_BODY_SIZE) -> None:
+ self.wsgi_app = wsgi_app
+ self.max_body_size = max_body_size
+ self.status_code = 500
+ self.headers: list = []
+
+ async def handle_http(
+ self, scope: HTTPScope, receive: Callable, spawn: Callable
+ ) -> Tuple[int, list, bytes]:
+ self.scope = scope
+ body = bytearray()
+ while True:
+ message = await receive()
+ body.extend(message.get("body", b""))
+ if len(body) > self.max_body_size:
+ return 400, [], b""
+ if not message.get("more_body"):
+ break
+ return await spawn(self.run_wsgi_app, body)
+
+ def _start_response(
+ self,
+ status: str,
+ response_headers: List[Tuple[str, str]],
+ exc_info: Optional[Exception] = None,
+ ) -> None:
+ raw, _ = status.split(" ", 1)
+ self.status_code = int(raw)
+ self.headers = [
+ (name.lower().encode("ascii"), value.encode("ascii"))
+ for name, value in response_headers
+ ]
+
+ def run_wsgi_app(self, body: bytes) -> Tuple[int, list, bytes]:
+ environ = _build_environ(self.scope, body)
+ body = bytearray()
+ for output in self.wsgi_app(environ, self._start_response):
+ body.extend(output)
+ return self.status_code, self.headers, body
+
+
+def _build_environ(scope: HTTPScope, body: bytes) -> dict:
+ server = scope.get("server") or ("localhost", 80)
+ environ = {
+ "REQUEST_METHOD": scope["method"],
+ "SCRIPT_NAME": scope.get("root_path", "").encode("utf8").decode("latin1"),
+ "PATH_INFO": scope["path"].encode("utf8").decode("latin1"),
+ "QUERY_STRING": scope["query_string"].decode("ascii"),
+ "SERVER_NAME": server[0],
+ "SERVER_PORT": server[1],
+ "SERVER_PROTOCOL": "HTTP/%s" % scope["http_version"],
+ "wsgi.version": (1, 0),
+ "wsgi.url_scheme": scope.get("scheme", "http"),
+ "wsgi.input": BytesIO(body),
+ "wsgi.errors": BytesIO(),
+ "wsgi.multithread": True,
+ "wsgi.multiprocess": True,
+ "wsgi.run_once": False,
+ }
+
+ if "client" in scope:
+ environ["REMOTE_ADDR"] = scope["client"][0]
+
+ for raw_name, raw_value in scope.get("headers", []):
+ name = raw_name.decode("latin1")
+ if name == "content-length":
+ corrected_name = "CONTENT_LENGTH"
+ elif name == "content-type":
+ corrected_name = "CONTENT_TYPE"
+ else:
+ corrected_name = "HTTP_%s" % name.upper().replace("-", "_")
+        # HTTPbis says only ASCII chars are allowed in headers, but decode as latin1 just in case
+ value = raw_value.decode("latin1")
+ if corrected_name in environ:
+ value = environ[corrected_name] + "," + value # type: ignore
+ environ[corrected_name] = value
+ return environ
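The header loop in `_build_environ` above maps ASGI header pairs to CGI-style environ keys: `content-length` and `content-type` get their dedicated names, everything else becomes `HTTP_*`, and repeated headers are comma-joined. A standalone sketch of just that mapping, with a hypothetical `headers_to_environ` helper:

```python
# Standalone sketch of the header mapping in _build_environ above
# (headers_to_environ is hypothetical): CGI names for the two special
# headers, HTTP_* for the rest, duplicates comma-joined.
def headers_to_environ(headers):
    environ = {}
    for raw_name, raw_value in headers:
        name = raw_name.decode("latin1")
        if name == "content-length":
            key = "CONTENT_LENGTH"
        elif name == "content-type":
            key = "CONTENT_TYPE"
        else:
            key = "HTTP_" + name.upper().replace("-", "_")
        value = raw_value.decode("latin1")
        if key in environ:
            value = environ[key] + "," + value  # repeated header -> comma-join
        environ[key] = value
    return environ

env = headers_to_environ(
    [(b"content-type", b"text/plain"), (b"x-tag", b"a"), (b"x-tag", b"b")]
)
print(env)  # {'CONTENT_TYPE': 'text/plain', 'HTTP_X_TAG': 'a,b'}
```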
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/__init__.py b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__init__.py
new file mode 100644
index 0000000..d632c10
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__init__.py
@@ -0,0 +1,86 @@
+from typing import Awaitable, Callable, Optional, Tuple, Union
+
+from .h2 import H2Protocol
+from .h11 import H2CProtocolRequired, H2ProtocolAssumed, H11Protocol
+from ..config import Config
+from ..events import Event, RawData
+from ..typing import ASGIFramework, Context
+
+
+class ProtocolWrapper:
+ def __init__(
+ self,
+ app: ASGIFramework,
+ config: Config,
+ context: Context,
+ ssl: bool,
+ client: Optional[Tuple[str, int]],
+ server: Optional[Tuple[str, int]],
+ send: Callable[[Event], Awaitable[None]],
+ alpn_protocol: Optional[str] = None,
+ ) -> None:
+ self.app = app
+ self.config = config
+ self.context = context
+ self.ssl = ssl
+ self.client = client
+ self.server = server
+ self.send = send
+ self.protocol: Union[H11Protocol, H2Protocol]
+ if alpn_protocol == "h2":
+ self.protocol = H2Protocol(
+ self.app,
+ self.config,
+ self.context,
+ self.ssl,
+ self.client,
+ self.server,
+ self.send,
+ )
+ else:
+ self.protocol = H11Protocol(
+ self.app,
+ self.config,
+ self.context,
+ self.ssl,
+ self.client,
+ self.server,
+ self.send,
+ )
+
+ @property
+ def idle(self) -> bool:
+ return self.protocol.idle
+
+ async def initiate(self) -> None:
+ return await self.protocol.initiate()
+
+ async def handle(self, event: Event) -> None:
+ try:
+ return await self.protocol.handle(event)
+ except H2ProtocolAssumed as error:
+ self.protocol = H2Protocol(
+ self.app,
+ self.config,
+ self.context,
+ self.ssl,
+ self.client,
+ self.server,
+ self.send,
+ )
+ await self.protocol.initiate()
+ if error.data != b"":
+ return await self.protocol.handle(RawData(data=error.data))
+ except H2CProtocolRequired as error:
+ self.protocol = H2Protocol(
+ self.app,
+ self.config,
+ self.context,
+ self.ssl,
+ self.client,
+ self.server,
+ self.send,
+ )
+ await self.protocol.initiate(error.headers, error.settings)
+ if error.data != b"":
+ return await self.protocol.handle(RawData(data=error.data))
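`ProtocolWrapper.handle` above uses exceptions as an upgrade signal: the HTTP/1.1 protocol raises when the bytes look like HTTP/2, the wrapper swaps in an H2 protocol, and the buffered bytes are replayed on it. A much-simplified synchronous sketch of that control flow (all class names here are hypothetical stand-ins, not hypercorn's real protocols):

```python
# Standalone sketch of the protocol-swap pattern in ProtocolWrapper.handle
# (H2Assumed, H1, H2, Wrapper are hypothetical): the h1 handler raises on
# the HTTP/2 connection preface, and the wrapper replays the bytes on h2.
class H2Assumed(Exception):
    def __init__(self, data):
        self.data = data

class H1:
    def handle(self, data):
        if data.startswith(b"PRI * HTTP/2.0"):  # the HTTP/2 connection preface
            raise H2Assumed(data)
        return "h1:" + data.decode()

class H2:
    def handle(self, data):
        return "h2"

class Wrapper:
    def __init__(self):
        self.protocol = H1()

    def handle(self, data):
        try:
            return self.protocol.handle(data)
        except H2Assumed as error:
            self.protocol = H2()  # swap protocols, then replay the buffered bytes
            return self.protocol.handle(error.data)

w = Wrapper()
print(w.handle(b"GET / HTTP/1.1"))      # h1:GET / HTTP/1.1
print(w.handle(b"PRI * HTTP/2.0\r\n"))  # h2
```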
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..f2543da
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/events.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/events.cpython-39.pyc
new file mode 100644
index 0000000..9e6a693
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/events.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/h11.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/h11.cpython-39.pyc
new file mode 100644
index 0000000..fd19d8a
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/h11.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/h2.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/h2.cpython-39.pyc
new file mode 100644
index 0000000..107e96a
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/h2.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/h3.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/h3.cpython-39.pyc
new file mode 100644
index 0000000..647f0c0
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/h3.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/http_stream.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/http_stream.cpython-39.pyc
new file mode 100644
index 0000000..9b6f3cc
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/http_stream.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/quic.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/quic.cpython-39.pyc
new file mode 100644
index 0000000..db328b0
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/quic.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/ws_stream.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/ws_stream.cpython-39.pyc
new file mode 100644
index 0000000..c1650f4
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/protocol/__pycache__/ws_stream.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/events.py b/.venv/lib/python3.9/site-packages/hypercorn/protocol/events.py
new file mode 100644
index 0000000..15c31d0
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/protocol/events.py
@@ -0,0 +1,46 @@
+from dataclasses import dataclass
+from typing import List, Tuple
+
+
+@dataclass(frozen=True)
+class Event:
+ stream_id: int
+
+
+@dataclass(frozen=True)
+class Request(Event):
+ headers: List[Tuple[bytes, bytes]]
+ http_version: str
+ method: str
+ raw_path: bytes
+
+
+@dataclass(frozen=True)
+class Body(Event):
+ data: bytes
+
+
+@dataclass(frozen=True)
+class EndBody(Event):
+ pass
+
+
+@dataclass(frozen=True)
+class Data(Event):
+ data: bytes
+
+
+@dataclass(frozen=True)
+class EndData(Event):
+ pass
+
+
+@dataclass(frozen=True)
+class Response(Event):
+ headers: List[Tuple[bytes, bytes]]
+ status_code: int
+
+
+@dataclass(frozen=True)
+class StreamClosed(Event):
+ pass
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/h11.py b/.venv/lib/python3.9/site-packages/hypercorn/protocol/h11.py
new file mode 100644
index 0000000..10cf5b7
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/protocol/h11.py
@@ -0,0 +1,289 @@
+from itertools import chain
+from typing import Awaitable, Callable, Optional, Tuple, Union
+
+import h11
+
+from .events import (
+ Body,
+ Data,
+ EndBody,
+ EndData,
+ Event as StreamEvent,
+ Request,
+ Response,
+ StreamClosed,
+)
+from .http_stream import HTTPStream
+from .ws_stream import WSStream
+from ..config import Config
+from ..events import Closed, Event, RawData, Updated
+from ..typing import ASGIFramework, Context, H11SendableEvent
+
+STREAM_ID = 1
+
+
+class H2CProtocolRequired(Exception):
+ def __init__(self, data: bytes, request: h11.Request) -> None:
+ settings = ""
+ headers = [(b":method", request.method), (b":path", request.target)]
+ for name, value in request.headers:
+ if name.lower() == b"http2-settings":
+ settings = value.decode()
+ elif name.lower() == b"host":
+ headers.append((b":authority", value))
+ headers.append((name, value))
+
+ self.data = data
+ self.headers = headers
+ self.settings = settings
+
+
+class H2ProtocolAssumed(Exception):
+ def __init__(self, data: bytes) -> None:
+ self.data = data
+
+
+class H11WSConnection:
+ # This class matches the h11 interface, and either passes data
+ # through without altering it (for Data, EndData) or sends h11
+ # events (Response, Body, EndBody).
+ our_state = None # Prevents recycling the connection
+ they_are_waiting_for_100_continue = False
+
+ def __init__(self, h11_connection: h11.Connection) -> None:
+ self.buffer = bytearray(h11_connection.trailing_data[0])
+ self.h11_connection = h11_connection
+
+ def receive_data(self, data: bytes) -> None:
+ self.buffer.extend(data)
+
+ def next_event(self) -> Data:
+ if self.buffer:
+ event = Data(stream_id=STREAM_ID, data=bytes(self.buffer))
+ self.buffer = bytearray()
+ return event
+ else:
+ return h11.NEED_DATA
+
+ def send(self, event: H11SendableEvent) -> bytes:
+ return self.h11_connection.send(event)
+
+
+class H11Protocol:
+ def __init__(
+ self,
+ app: ASGIFramework,
+ config: Config,
+ context: Context,
+ ssl: bool,
+ client: Optional[Tuple[str, int]],
+ server: Optional[Tuple[str, int]],
+ send: Callable[[Event], Awaitable[None]],
+ ) -> None:
+ self.app = app
+ self.can_read = context.event_class()
+ self.client = client
+ self.config = config
+ self.connection = h11.Connection(
+ h11.SERVER, max_incomplete_event_size=self.config.h11_max_incomplete_size
+ )
+ self.context = context
+ self.send = send
+ self.server = server
+ self.ssl = ssl
+ self.stream: Optional[Union[HTTPStream, WSStream]] = None
+
+ @property
+ def idle(self) -> bool:
+ return self.stream is None or self.stream.idle
+
+ async def initiate(self) -> None:
+ pass
+
+ async def handle(self, event: Event) -> None:
+ if isinstance(event, RawData):
+ self.connection.receive_data(event.data)
+ await self._handle_events()
+ elif isinstance(event, Closed):
+ if self.stream is not None:
+ await self._close_stream()
+
+ async def stream_send(self, event: StreamEvent) -> None:
+ if isinstance(event, Response):
+ if event.status_code >= 200:
+ await self._send_h11_event(
+ h11.Response(
+ headers=chain(event.headers, self.config.response_headers("h11")),
+ status_code=event.status_code,
+ )
+ )
+ else:
+ await self._send_h11_event(
+ h11.InformationalResponse(
+ headers=chain(event.headers, self.config.response_headers("h11")),
+ status_code=event.status_code,
+ )
+ )
+ elif isinstance(event, Body):
+ await self._send_h11_event(h11.Data(data=event.data))
+ elif isinstance(event, EndBody):
+ await self._send_h11_event(h11.EndOfMessage())
+ elif isinstance(event, Data):
+ await self.send(RawData(data=event.data))
+ elif isinstance(event, EndData):
+ pass
+ elif isinstance(event, StreamClosed):
+ await self._maybe_recycle()
+
+ async def _handle_events(self) -> None:
+ while True:
+ if self.connection.they_are_waiting_for_100_continue:
+ await self._send_h11_event(
+ h11.InformationalResponse(
+ status_code=100, headers=self.config.response_headers("h11")
+ )
+ )
+
+ try:
+ event = self.connection.next_event()
+ except h11.RemoteProtocolError:
+ if self.connection.our_state in {h11.IDLE, h11.SEND_RESPONSE}:
+ await self._send_error_response(400)
+ await self.send(Closed())
+ break
+ else:
+ if isinstance(event, h11.Request):
+ await self._check_protocol(event)
+ await self._create_stream(event)
+ elif event is h11.PAUSED:
+ await self.can_read.clear()
+ await self.send(Updated())
+ await self.can_read.wait()
+ elif isinstance(event, h11.ConnectionClosed) or event is h11.NEED_DATA:
+ break
+ elif self.stream is None:
+ break
+ elif isinstance(event, h11.Data):
+ await self.stream.handle(Body(stream_id=STREAM_ID, data=event.data))
+ elif isinstance(event, h11.EndOfMessage):
+ await self.stream.handle(EndBody(stream_id=STREAM_ID))
+ elif isinstance(event, Data):
+ # WebSocket pass through
+ await self.stream.handle(event)
+
+ async def _create_stream(self, request: h11.Request) -> None:
+ upgrade_value = ""
+ connection_value = ""
+ for name, value in request.headers:
+ sanitised_name = name.decode("latin1").strip().lower()
+ if sanitised_name == "upgrade":
+ upgrade_value = value.decode("latin1").strip()
+ elif sanitised_name == "connection":
+ connection_value = value.decode("latin1").strip()
+
+ connection_tokens = connection_value.lower().split(",")
+ if (
+ any(token.strip() == "upgrade" for token in connection_tokens)
+ and upgrade_value.lower() == "websocket"
+ and request.method.decode("ascii").upper() == "GET"
+ ):
+ self.stream = WSStream(
+ self.app,
+ self.config,
+ self.context,
+ self.ssl,
+ self.client,
+ self.server,
+ self.stream_send,
+ STREAM_ID,
+ )
+ self.connection = H11WSConnection(self.connection)
+ else:
+ self.stream = HTTPStream(
+ self.app,
+ self.config,
+ self.context,
+ self.ssl,
+ self.client,
+ self.server,
+ self.stream_send,
+ STREAM_ID,
+ )
+ await self.stream.handle(
+ Request(
+ stream_id=STREAM_ID,
+ headers=request.headers,
+ http_version=request.http_version.decode(),
+ method=request.method.decode("ascii").upper(),
+ raw_path=request.target,
+ )
+ )
+
+ async def _send_h11_event(self, event: H11SendableEvent) -> None:
+ try:
+ data = self.connection.send(event)
+ except h11.LocalProtocolError:
+ if self.connection.their_state != h11.ERROR:
+ raise
+ else:
+ await self.send(RawData(data=data))
+
+ async def _send_error_response(self, status_code: int) -> None:
+ await self._send_h11_event(
+ h11.Response(
+ status_code=status_code,
+ headers=chain(
+ [(b"content-length", b"0"), (b"connection", b"close")],
+ self.config.response_headers("h11"),
+ ),
+ )
+ )
+ await self._send_h11_event(h11.EndOfMessage())
+
+ async def _maybe_recycle(self) -> None:
+ await self._close_stream()
+ if self.connection.our_state is h11.DONE:
+ try:
+ self.connection.start_next_cycle()
+ except h11.LocalProtocolError:
+ await self.send(Closed())
+ else:
+ self.response = None
+ self.scope = None
+ await self.can_read.set()
+ await self.send(Updated())
+ else:
+ await self.can_read.set()
+ await self.send(Closed())
+
+ async def _close_stream(self) -> None:
+ if self.stream is not None:
+ await self.stream.handle(StreamClosed(stream_id=STREAM_ID))
+ self.stream = None
+
+ async def _check_protocol(self, event: h11.Request) -> None:
+ upgrade_value = ""
+ has_body = False
+ for name, value in event.headers:
+ sanitised_name = name.decode("latin1").strip().lower()
+ if sanitised_name == "upgrade":
+ upgrade_value = value.decode("latin1").strip()
+ elif sanitised_name in {"content-length", "transfer-encoding"}:
+ has_body = True
+
+ # h2c Upgrade requests with a body are a pain as the body must
+ # be fully received in HTTP/1.1 before the upgrade response
+ # and HTTP/2 takes over, so Hypercorn ignores the upgrade and
+ # responds in HTTP/1.1. Use a preflight OPTIONS request to
+ # initiate the upgrade if really required (or just use h2).
+ if upgrade_value.lower() == "h2c" and not has_body:
+ await self._send_h11_event(
+ h11.InformationalResponse(
+ status_code=101,
+ headers=self.config.response_headers("h11")
+ + [(b"connection", b"upgrade"), (b"upgrade", b"h2c")],
+ )
+ )
+ raise H2CProtocolRequired(self.connection.trailing_data[0], event)
+ elif event.method == b"PRI" and event.target == b"*" and event.http_version == b"2.0":
+ raise H2ProtocolAssumed(b"PRI * HTTP/2.0\r\n\r\n" + self.connection.trailing_data[0])
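Reviewer note: the header scan in `_check_protocol` above is the load-bearing part of the h2c decision. A self-contained sketch of that check (the `wants_h2c_upgrade` name is mine, not from the diff): an `Upgrade: h2c` request qualifies only when no `content-length` or `transfer-encoding` header indicates a body, since a body would have to be drained in HTTP/1.1 before HTTP/2 could take over.

```python
from typing import List, Tuple


def wants_h2c_upgrade(headers: List[Tuple[bytes, bytes]]) -> bool:
    upgrade_value = ""
    has_body = False
    for name, value in headers:
        # Header names arrive as raw bytes; normalise before comparing.
        sanitised = name.decode("latin1").strip().lower()
        if sanitised == "upgrade":
            upgrade_value = value.decode("latin1").strip()
        elif sanitised in {"content-length", "transfer-encoding"}:
            has_body = True
    # Upgrade only bodyless requests, matching the comment in the diff.
    return upgrade_value.lower() == "h2c" and not has_body
```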
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/h2.py b/.venv/lib/python3.9/site-packages/hypercorn/protocol/h2.py
new file mode 100644
index 0000000..232f9b9
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/protocol/h2.py
@@ -0,0 +1,362 @@
+from typing import Awaitable, Callable, Dict, List, Optional, Tuple, Type, Union
+
+import h2
+import h2.connection
+import h2.events
+import h2.exceptions
+import priority
+
+from .events import (
+ Body,
+ Data,
+ EndBody,
+ EndData,
+ Event as StreamEvent,
+ Request,
+ Response,
+ StreamClosed,
+)
+from .http_stream import HTTPStream
+from .ws_stream import WSStream
+from ..config import Config
+from ..events import Closed, Event, RawData, Updated
+from ..typing import ASGIFramework, Context, Event as IOEvent
+from ..utils import filter_pseudo_headers
+
+BUFFER_HIGH_WATER = 2 * 2 ** 14  # Twice the default max frame size (two frames' worth)
+BUFFER_LOW_WATER = BUFFER_HIGH_WATER / 2
+
+
+class BufferCompleteError(Exception):
+ pass
+
+
+class StreamBuffer:
+ def __init__(self, event_class: Type[IOEvent]) -> None:
+ self.buffer = bytearray()
+ self._complete = False
+ self._is_empty = event_class()
+ self._paused = event_class()
+
+ async def drain(self) -> None:
+ await self._is_empty.wait()
+
+ def set_complete(self) -> None:
+ self._complete = True
+
+ async def close(self) -> None:
+ self._complete = True
+ self.buffer = bytearray()
+ await self._is_empty.set()
+ await self._paused.set()
+
+ @property
+ def complete(self) -> bool:
+ return self._complete and len(self.buffer) == 0
+
+ async def push(self, data: bytes) -> None:
+ if self._complete:
+ raise BufferCompleteError()
+ self.buffer.extend(data)
+ await self._is_empty.clear()
+ if len(self.buffer) >= BUFFER_HIGH_WATER:
+ await self._paused.wait()
+ await self._paused.clear()
+
+ async def pop(self, max_length: int) -> bytes:
+ length = min(len(self.buffer), max_length)
+ data = bytes(self.buffer[:length])
+ del self.buffer[:length]
+ if len(data) < BUFFER_LOW_WATER:
+ await self._paused.set()
+ if len(self.buffer) == 0:
+ await self._is_empty.set()
+ return data
+
+
+class H2Protocol:
+ def __init__(
+ self,
+ app: ASGIFramework,
+ config: Config,
+ context: Context,
+ ssl: bool,
+ client: Optional[Tuple[str, int]],
+ server: Optional[Tuple[str, int]],
+ send: Callable[[Event], Awaitable[None]],
+ ) -> None:
+ self.app = app
+ self.client = client
+ self.closed = False
+ self.config = config
+ self.context = context
+
+ self.connection = h2.connection.H2Connection(
+ config=h2.config.H2Configuration(client_side=False, header_encoding=None)
+ )
+ self.connection.DEFAULT_MAX_INBOUND_FRAME_SIZE = config.h2_max_inbound_frame_size
+ self.connection.local_settings = h2.settings.Settings(
+ client=False,
+ initial_values={
+ h2.settings.SettingCodes.MAX_CONCURRENT_STREAMS: config.h2_max_concurrent_streams,
+ h2.settings.SettingCodes.MAX_HEADER_LIST_SIZE: config.h2_max_header_list_size,
+ h2.settings.SettingCodes.ENABLE_CONNECT_PROTOCOL: 1,
+ },
+ )
+
+ self.send = send
+ self.server = server
+ self.ssl = ssl
+ self.streams: Dict[int, Union[HTTPStream, WSStream]] = {}
+ # The below are used by the sending task
+ self.has_data = self.context.event_class()
+ self.priority = priority.PriorityTree()
+ self.stream_buffers: Dict[int, StreamBuffer] = {}
+
+ @property
+ def idle(self) -> bool:
+ return len(self.streams) == 0 or all(stream.idle for stream in self.streams.values())
+
+ async def initiate(
+ self, headers: Optional[List[Tuple[bytes, bytes]]] = None, settings: Optional[str] = None
+ ) -> None:
+ if settings is not None:
+ self.connection.initiate_upgrade_connection(settings)
+ else:
+ self.connection.initiate_connection()
+ await self._flush()
+ if headers is not None:
+ event = h2.events.RequestReceived()
+ event.stream_id = 1
+ event.headers = headers
+ await self._create_stream(event)
+ await self.streams[event.stream_id].handle(EndBody(stream_id=event.stream_id))
+ self.context.spawn(self.send_task)
+
+ async def send_task(self) -> None:
+ # This should be run in a separate task from the rest of this
+ # class, as it must be able to choose independently when, and
+ # crucially in what order, to send data.
+ while not self.closed:
+ try:
+ stream_id = next(self.priority)
+ except priority.DeadlockError:
+ await self.has_data.wait()
+ await self.has_data.clear()
+ else:
+ await self._send_data(stream_id)
+
+ async def _send_data(self, stream_id: int) -> None:
+ try:
+ chunk_size = min(
+ self.connection.local_flow_control_window(stream_id),
+ self.connection.max_outbound_frame_size,
+ )
+ chunk_size = max(0, chunk_size)
+ data = await self.stream_buffers[stream_id].pop(chunk_size)
+ if data:
+ self.connection.send_data(stream_id, data)
+ await self._flush()
+ else:
+ self.priority.block(stream_id)
+
+ if self.stream_buffers[stream_id].complete:
+ self.connection.end_stream(stream_id)
+ await self._flush()
+ del self.stream_buffers[stream_id]
+ self.priority.remove_stream(stream_id)
+ except (h2.exceptions.StreamClosedError, KeyError, h2.exceptions.ProtocolError):
+ # Stream or connection has closed whilst waiting to send
+ # data, not a problem - just force close it.
+ await self.stream_buffers[stream_id].close()
+ del self.stream_buffers[stream_id]
+ self.priority.remove_stream(stream_id)
+
+ async def handle(self, event: Event) -> None:
+ if isinstance(event, RawData):
+ try:
+ events = self.connection.receive_data(event.data)
+ except h2.exceptions.ProtocolError:
+ await self._flush()
+ await self.send(Closed())
+ else:
+ await self._handle_events(events)
+ elif isinstance(event, Closed):
+ self.closed = True
+ stream_ids = list(self.streams.keys())
+ for stream_id in stream_ids:
+ await self._close_stream(stream_id)
+ await self.has_data.set()
+
+ async def stream_send(self, event: StreamEvent) -> None:
+ try:
+ if isinstance(event, Response):
+ self.connection.send_headers(
+ event.stream_id,
+ [(b":status", b"%d" % event.status_code)]
+ + event.headers
+ + self.config.response_headers("h2"),
+ )
+ await self._flush()
+ elif isinstance(event, (Body, Data)):
+ self.priority.unblock(event.stream_id)
+ await self.has_data.set()
+ await self.stream_buffers[event.stream_id].push(event.data)
+ elif isinstance(event, (EndBody, EndData)):
+ self.stream_buffers[event.stream_id].set_complete()
+ self.priority.unblock(event.stream_id)
+ await self.has_data.set()
+ await self.stream_buffers[event.stream_id].drain()
+ elif isinstance(event, StreamClosed):
+ await self._close_stream(event.stream_id)
+ await self.send(Updated())
+ elif isinstance(event, Request):
+ await self._create_server_push(event.stream_id, event.raw_path, event.headers)
+ except (
+ BufferCompleteError,
+ KeyError,
+ priority.MissingStreamError,
+ h2.exceptions.ProtocolError,
+ ):
+ # The connection has closed whilst blocked on flow control, or
+ # it has advanced ahead of the last emitted event.
+ return
+
+ async def _handle_events(self, events: List[h2.events.Event]) -> None:
+ for event in events:
+ if isinstance(event, h2.events.RequestReceived):
+ await self._create_stream(event)
+ elif isinstance(event, h2.events.DataReceived):
+ await self.streams[event.stream_id].handle(
+ Body(stream_id=event.stream_id, data=event.data)
+ )
+ self.connection.acknowledge_received_data(
+ event.flow_controlled_length, event.stream_id
+ )
+ elif isinstance(event, h2.events.StreamEnded):
+ await self.streams[event.stream_id].handle(EndBody(stream_id=event.stream_id))
+ elif isinstance(event, h2.events.StreamReset):
+ await self._close_stream(event.stream_id)
+ await self._window_updated(event.stream_id)
+ elif isinstance(event, h2.events.WindowUpdated):
+ await self._window_updated(event.stream_id)
+ elif isinstance(event, h2.events.PriorityUpdated):
+ await self._priority_updated(event)
+ elif isinstance(event, h2.events.RemoteSettingsChanged):
+ if h2.settings.SettingCodes.INITIAL_WINDOW_SIZE in event.changed_settings:
+ await self._window_updated(None)
+ elif isinstance(event, h2.events.ConnectionTerminated):
+ await self.send(Closed())
+ await self._flush()
+
+ async def _flush(self) -> None:
+ data = self.connection.data_to_send()
+ if data != b"":
+ await self.send(RawData(data=data))
+
+ async def _window_updated(self, stream_id: Optional[int]) -> None:
+ if stream_id is None or stream_id == 0:
+ # Unblock all streams
+ for stream_id in list(self.stream_buffers.keys()):
+ self.priority.unblock(stream_id)
+ elif stream_id is not None and stream_id in self.stream_buffers:
+ self.priority.unblock(stream_id)
+ await self.has_data.set()
+
+ async def _priority_updated(self, event: h2.events.PriorityUpdated) -> None:
+ try:
+ self.priority.reprioritize(
+ stream_id=event.stream_id,
+ depends_on=event.depends_on or None,
+ weight=event.weight,
+ exclusive=event.exclusive,
+ )
+ except priority.MissingStreamError:
+ # Received PRIORITY frame before HEADERS frame
+ self.priority.insert_stream(
+ stream_id=event.stream_id,
+ depends_on=event.depends_on or None,
+ weight=event.weight,
+ exclusive=event.exclusive,
+ )
+ self.priority.block(event.stream_id)
+ await self.has_data.set()
+
+ async def _create_stream(self, request: h2.events.RequestReceived) -> None:
+ for name, value in request.headers:
+ if name == b":method":
+ method = value.decode("ascii").upper()
+ elif name == b":path":
+ raw_path = value
+
+ if method == "CONNECT":
+ self.streams[request.stream_id] = WSStream(
+ self.app,
+ self.config,
+ self.context,
+ self.ssl,
+ self.client,
+ self.server,
+ self.stream_send,
+ request.stream_id,
+ )
+ else:
+ self.streams[request.stream_id] = HTTPStream(
+ self.app,
+ self.config,
+ self.context,
+ self.ssl,
+ self.client,
+ self.server,
+ self.stream_send,
+ request.stream_id,
+ )
+ self.stream_buffers[request.stream_id] = StreamBuffer(self.context.event_class)
+ try:
+ self.priority.insert_stream(request.stream_id)
+ except priority.DuplicateStreamError:
+ # Received PRIORITY frame before HEADERS frame
+ pass
+ else:
+ self.priority.block(request.stream_id)
+
+ await self.streams[request.stream_id].handle(
+ Request(
+ stream_id=request.stream_id,
+ headers=filter_pseudo_headers(request.headers),
+ http_version="2",
+ method=method,
+ raw_path=raw_path,
+ )
+ )
+
+ async def _create_server_push(
+ self, stream_id: int, path: bytes, headers: List[Tuple[bytes, bytes]]
+ ) -> None:
+ push_stream_id = self.connection.get_next_available_stream_id()
+ request_headers = [(b":method", b"GET"), (b":path", path)]
+ request_headers.extend(headers)
+ request_headers.extend(self.config.response_headers("h2"))
+ try:
+ self.connection.push_stream(
+ stream_id=stream_id,
+ promised_stream_id=push_stream_id,
+ request_headers=request_headers,
+ )
+ await self._flush()
+ except h2.exceptions.ProtocolError:
+ # Client does not accept push promises, or we are trying to
+ # push on a stream that was itself pushed.
+ pass
+ else:
+ event = h2.events.RequestReceived()
+ event.stream_id = push_stream_id
+ event.headers = request_headers
+ await self._create_stream(event)
+ await self.streams[event.stream_id].handle(EndBody(stream_id=event.stream_id))
+
+ async def _close_stream(self, stream_id: int) -> None:
+ if stream_id in self.streams:
+ stream = self.streams.pop(stream_id)
+ await stream.handle(StreamClosed(stream_id=stream_id))
+ await self.has_data.set()
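Reviewer note: `StreamBuffer` above implements high/low watermark backpressure against Hypercorn's pluggable event class. A simplified asyncio-only rendition (class and constant names are illustrative, not the diff's API): `push()` blocks once two frames' worth of data is buffered and resumes only after `pop()` drains below the low-water mark.

```python
import asyncio

BUFFER_HIGH_WATER = 2 * 2 ** 14  # two max-size frames
BUFFER_LOW_WATER = BUFFER_HIGH_WATER // 2


class WatermarkBuffer:
    def __init__(self) -> None:
        self.data = bytearray()
        self._resume = asyncio.Event()
        self._resume.set()  # empty buffer: producer may write

    async def push(self, chunk: bytes) -> None:
        self.data.extend(chunk)
        if len(self.data) >= BUFFER_HIGH_WATER:
            self._resume.clear()  # apply backpressure to the producer
        await self._resume.wait()

    def pop(self, max_length: int) -> bytes:
        out = bytes(self.data[:max_length])
        del self.data[:max_length]
        if len(self.data) < BUFFER_LOW_WATER:
            self._resume.set()  # producer may continue
        return out
```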
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/h3.py b/.venv/lib/python3.9/site-packages/hypercorn/protocol/h3.py
new file mode 100644
index 0000000..1102ba5
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/protocol/h3.py
@@ -0,0 +1,138 @@
+from typing import Awaitable, Callable, Dict, List, Optional, Tuple, Union
+
+from aioquic.h3.connection import H3Connection
+from aioquic.h3.events import DataReceived, HeadersReceived
+from aioquic.h3.exceptions import NoAvailablePushIDError
+from aioquic.quic.connection import QuicConnection
+from aioquic.quic.events import QuicEvent
+
+from .events import (
+ Body,
+ Data,
+ EndBody,
+ EndData,
+ Event as StreamEvent,
+ Request,
+ Response,
+ StreamClosed,
+)
+from .http_stream import HTTPStream
+from .ws_stream import WSStream
+from ..config import Config
+from ..typing import ASGIFramework, Context
+from ..utils import filter_pseudo_headers
+
+
+class H3Protocol:
+ def __init__(
+ self,
+ app: ASGIFramework,
+ config: Config,
+ context: Context,
+ client: Optional[Tuple[str, int]],
+ server: Optional[Tuple[str, int]],
+ quic: QuicConnection,
+ send: Callable[[], Awaitable[None]],
+ ) -> None:
+ self.app = app
+ self.client = client
+ self.config = config
+ self.context = context
+ self.connection = H3Connection(quic)
+ self.send = send
+ self.server = server
+ self.streams: Dict[int, Union[HTTPStream, WSStream]] = {}
+
+ async def handle(self, quic_event: QuicEvent) -> None:
+ for event in self.connection.handle_event(quic_event):
+ if isinstance(event, HeadersReceived):
+ await self._create_stream(event)
+ if event.stream_ended:
+ await self.streams[event.stream_id].handle(EndBody(stream_id=event.stream_id))
+ elif isinstance(event, DataReceived):
+ await self.streams[event.stream_id].handle(
+ Body(stream_id=event.stream_id, data=event.data)
+ )
+ if event.stream_ended:
+ await self.streams[event.stream_id].handle(EndBody(stream_id=event.stream_id))
+
+ async def stream_send(self, event: StreamEvent) -> None:
+ if isinstance(event, Response):
+ self.connection.send_headers(
+ event.stream_id,
+ [(b":status", b"%d" % event.status_code)]
+ + event.headers
+ + self.config.response_headers("h3"),
+ )
+ await self.send()
+ elif isinstance(event, (Body, Data)):
+ self.connection.send_data(event.stream_id, event.data, False)
+ await self.send()
+ elif isinstance(event, (EndBody, EndData)):
+ self.connection.send_data(event.stream_id, b"", True)
+ await self.send()
+ elif isinstance(event, StreamClosed):
+ pass  # Nothing to send for a closed HTTP/3 stream
+ elif isinstance(event, Request):
+ await self._create_server_push(event.stream_id, event.raw_path, event.headers)
+
+ async def _create_stream(self, request: HeadersReceived) -> None:
+ for name, value in request.headers:
+ if name == b":method":
+ method = value.decode("ascii").upper()
+ elif name == b":path":
+ raw_path = value
+
+ if method == "CONNECT":
+ self.streams[request.stream_id] = WSStream(
+ self.app,
+ self.config,
+ self.context,
+ True,
+ self.client,
+ self.server,
+ self.stream_send,
+ request.stream_id,
+ )
+ else:
+ self.streams[request.stream_id] = HTTPStream(
+ self.app,
+ self.config,
+ self.context,
+ True,
+ self.client,
+ self.server,
+ self.stream_send,
+ request.stream_id,
+ )
+
+ await self.streams[request.stream_id].handle(
+ Request(
+ stream_id=request.stream_id,
+ headers=filter_pseudo_headers(request.headers),
+ http_version="3",
+ method=method,
+ raw_path=raw_path,
+ )
+ )
+
+ async def _create_server_push(
+ self, stream_id: int, path: bytes, headers: List[Tuple[bytes, bytes]]
+ ) -> None:
+ request_headers = [(b":method", b"GET"), (b":path", path)]
+ request_headers.extend(headers)
+ request_headers.extend(self.config.response_headers("h3"))
+ try:
+ push_stream_id = self.connection.send_push_promise(
+ stream_id=stream_id, headers=request_headers
+ )
+ except NoAvailablePushIDError:
+ # Client does not accept push promises, or we are trying to
+ # push on a stream that was itself pushed.
+ pass
+ else:
+ event = HeadersReceived(
+ stream_id=push_stream_id, stream_ended=True, headers=request_headers
+ )
+ await self._create_stream(event)
+ await self.streams[event.stream_id].handle(EndBody(stream_id=event.stream_id))
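Reviewer note: both the h2 and h3 protocols hand `filter_pseudo_headers(request.headers)` to the stream. A hedged sketch of what that utility is assumed to do (the real implementation lives in `hypercorn.utils` and may differ in detail): drop `":"`-prefixed HTTP/2 and HTTP/3 pseudo-headers from the list, mapping `:authority` back to a conventional `host` header.

```python
from typing import List, Tuple


def filter_pseudo_headers(
    headers: List[Tuple[bytes, bytes]]
) -> List[Tuple[bytes, bytes]]:
    filtered = []
    for name, value in headers:
        if name == b":authority":
            # Pseudo-header carrying the host; surface it as "host".
            filtered.append((b"host", value))
        elif not name.startswith(b":"):
            filtered.append((name, value))
    return filtered
```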
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/http_stream.py b/.venv/lib/python3.9/site-packages/hypercorn/protocol/http_stream.py
new file mode 100644
index 0000000..14f68aa
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/protocol/http_stream.py
@@ -0,0 +1,175 @@
+from enum import auto, Enum
+from time import time
+from typing import Awaitable, Callable, Optional, Tuple
+from urllib.parse import unquote
+
+from .events import Body, EndBody, Event, Request, Response, StreamClosed
+from ..config import Config
+from ..typing import ASGIFramework, ASGISendEvent, Context, HTTPResponseStartEvent, HTTPScope
+from ..utils import build_and_validate_headers, suppress_body, UnexpectedMessage, valid_server_name
+
+PUSH_VERSIONS = {"2", "3"}
+
+
+class ASGIHTTPState(Enum):
+ # The ASGI spec is clear that a response should not start until
+ # the framework has sent at least one body message, hence this
+ # state tracking is required.
+ REQUEST = auto()
+ RESPONSE = auto()
+ CLOSED = auto()
+
+
+class HTTPStream:
+ def __init__(
+ self,
+ app: ASGIFramework,
+ config: Config,
+ context: Context,
+ ssl: bool,
+ client: Optional[Tuple[str, int]],
+ server: Optional[Tuple[str, int]],
+ send: Callable[[Event], Awaitable[None]],
+ stream_id: int,
+ ) -> None:
+ self.app = app
+ self.client = client
+ self.closed = False
+ self.config = config
+ self.context = context
+ self.response: HTTPResponseStartEvent
+ self.scope: HTTPScope
+ self.send = send
+ self.scheme = "https" if ssl else "http"
+ self.server = server
+ self.start_time: float
+ self.state = ASGIHTTPState.REQUEST
+ self.stream_id = stream_id
+
+ @property
+ def idle(self) -> bool:
+ return False
+
+ async def handle(self, event: Event) -> None:
+ if self.closed:
+ return
+ elif isinstance(event, Request):
+ self.start_time = time()
+ path, _, query_string = event.raw_path.partition(b"?")
+ self.scope = {
+ "type": "http",
+ "http_version": event.http_version,
+ "asgi": {"spec_version": "2.1"},
+ "method": event.method,
+ "scheme": self.scheme,
+ "path": unquote(path.decode("ascii")),
+ "raw_path": path,
+ "query_string": query_string,
+ "root_path": self.config.root_path,
+ "headers": event.headers,
+ "client": self.client,
+ "server": self.server,
+ "extensions": {},
+ }
+ if event.http_version in PUSH_VERSIONS:
+ self.scope["extensions"]["http.response.push"] = {}
+
+ if valid_server_name(self.config, event):
+ self.app_put = await self.context.spawn_app(
+ self.app, self.config, self.scope, self.app_send
+ )
+ else:
+ await self._send_error_response(404)
+ self.closed = True
+
+ elif isinstance(event, Body):
+ await self.app_put(
+ {"type": "http.request", "body": bytes(event.data), "more_body": True}
+ )
+ elif isinstance(event, EndBody):
+ await self.app_put({"type": "http.request", "body": b"", "more_body": False})
+ elif isinstance(event, StreamClosed):
+ self.closed = True
+ if self.app_put is not None:
+ await self.app_put({"type": "http.disconnect"}) # type: ignore
+
+ async def app_send(self, message: Optional[ASGISendEvent]) -> None:
+ if self.closed:
+ # Allow app to finish after close
+ return
+
+ if message is None: # ASGI App has finished sending messages
+ # Cleanup if required
+ if self.state == ASGIHTTPState.REQUEST:
+ await self._send_error_response(500)
+ await self.send(StreamClosed(stream_id=self.stream_id))
+ else:
+ if message["type"] == "http.response.start" and self.state == ASGIHTTPState.REQUEST:
+ self.response = message
+ elif (
+ message["type"] == "http.response.push"
+ and self.scope["http_version"] in PUSH_VERSIONS
+ ):
+ if not isinstance(message["path"], str):
+ raise TypeError(f"{message['path']} should be a str")
+ headers = [(b":scheme", self.scope["scheme"].encode())]
+ for name, value in self.scope["headers"]:
+ if name == b"host":
+ headers.append((b":authority", value))
+ headers.extend(build_and_validate_headers(message["headers"]))
+ await self.send(
+ Request(
+ stream_id=self.stream_id,
+ headers=headers,
+ http_version=self.scope["http_version"],
+ method="GET",
+ raw_path=message["path"].encode(),
+ )
+ )
+ elif message["type"] == "http.response.body" and self.state in {
+ ASGIHTTPState.REQUEST,
+ ASGIHTTPState.RESPONSE,
+ }:
+ if self.state == ASGIHTTPState.REQUEST:
+ headers = build_and_validate_headers(self.response.get("headers", []))
+ await self.send(
+ Response(
+ stream_id=self.stream_id,
+ headers=headers,
+ status_code=int(self.response["status"]),
+ )
+ )
+ self.state = ASGIHTTPState.RESPONSE
+
+ if (
+ not suppress_body(self.scope["method"], int(self.response["status"]))
+ and message.get("body", b"") != b""
+ ):
+ await self.send(
+ Body(stream_id=self.stream_id, data=bytes(message.get("body", b"")))
+ )
+
+ if not message.get("more_body", False):
+ if self.state != ASGIHTTPState.CLOSED:
+ self.state = ASGIHTTPState.CLOSED
+ await self.config.log.access(
+ self.scope, self.response, time() - self.start_time
+ )
+ await self.send(EndBody(stream_id=self.stream_id))
+ await self.send(StreamClosed(stream_id=self.stream_id))
+ else:
+ raise UnexpectedMessage(self.state, message["type"])
+
+ async def _send_error_response(self, status_code: int) -> None:
+ await self.send(
+ Response(
+ stream_id=self.stream_id,
+ headers=[(b"content-length", b"0"), (b"connection", b"close")],
+ status_code=status_code,
+ )
+ )
+ await self.send(EndBody(stream_id=self.stream_id))
+ self.state = ASGIHTTPState.CLOSED
+ await self.config.log.access(
+ self.scope, {"status": status_code, "headers": []}, time() - self.start_time
+ )
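Reviewer note: the scope construction in `handle(Request)` above splits the raw target once on `?` and percent-decodes only the path component, leaving the query string as raw bytes. A minimal extraction of that step (the `split_target` helper name is mine):

```python
from urllib.parse import unquote


def split_target(raw_path: bytes):
    # Split once on "?": everything after the first "?" is the query
    # string and is passed through un-decoded, while the path component
    # is percent-decoded for the ASGI "path" key ("raw_path" keeps the
    # undecoded bytes).
    path, _, query_string = raw_path.partition(b"?")
    return unquote(path.decode("ascii")), path, query_string
```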
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/quic.py b/.venv/lib/python3.9/site-packages/hypercorn/protocol/quic.py
new file mode 100644
index 0000000..839f0ba
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/protocol/quic.py
@@ -0,0 +1,125 @@
+from functools import partial
+from typing import Awaitable, Callable, Dict, Optional, Tuple
+
+from aioquic.buffer import Buffer
+from aioquic.h3.connection import H3_ALPN
+from aioquic.quic.configuration import QuicConfiguration
+from aioquic.quic.connection import QuicConnection
+from aioquic.quic.events import (
+ ConnectionIdIssued,
+ ConnectionIdRetired,
+ ConnectionTerminated,
+ ProtocolNegotiated,
+)
+from aioquic.quic.packet import (
+ encode_quic_version_negotiation,
+ PACKET_TYPE_INITIAL,
+ pull_quic_header,
+)
+
+from .h3 import H3Protocol
+from ..config import Config
+from ..events import Closed, Event, RawData
+from ..typing import ASGIFramework, Context
+
+
+class QuicProtocol:
+ def __init__(
+ self,
+ app: ASGIFramework,
+ config: Config,
+ context: Context,
+ server: Optional[Tuple[str, int]],
+ send: Callable[[Event], Awaitable[None]],
+ ) -> None:
+ self.app = app
+ self.config = config
+ self.context = context
+ self.connections: Dict[bytes, QuicConnection] = {}
+ self.http_connections: Dict[QuicConnection, H3Protocol] = {}
+ self.send = send
+ self.server = server
+
+ self.quic_config = QuicConfiguration(alpn_protocols=H3_ALPN, is_client=False)
+ self.quic_config.load_cert_chain(certfile=config.certfile, keyfile=config.keyfile)
+
+ async def handle(self, event: Event) -> None:
+ if isinstance(event, RawData):
+ try:
+ header = pull_quic_header(Buffer(data=event.data), host_cid_length=8)
+ except ValueError:
+ return
+ if (
+ header.version is not None
+ and header.version not in self.quic_config.supported_versions
+ ):
+ data = encode_quic_version_negotiation(
+ source_cid=header.destination_cid,
+ destination_cid=header.source_cid,
+ supported_versions=self.quic_config.supported_versions,
+ )
+ await self.send(RawData(data=data, address=event.address))
+ return
+
+ connection = self.connections.get(header.destination_cid)
+ if (
+ connection is None
+ and len(event.data) >= 1200
+ and header.packet_type == PACKET_TYPE_INITIAL
+ ):
+ connection = QuicConnection(
+ configuration=self.quic_config,
+ original_destination_connection_id=header.destination_cid,
+ )
+ self.connections[header.destination_cid] = connection
+ self.connections[connection.host_cid] = connection
+
+ if connection is not None:
+ connection.receive_datagram(event.data, event.address, now=self.context.time())
+ await self._handle_events(connection, event.address)
+ elif isinstance(event, Closed):
+ pass
+
+ async def send_all(self, connection: QuicConnection) -> None:
+ for data, address in connection.datagrams_to_send(now=self.context.time()):
+ await self.send(RawData(data=data, address=address))
+
+ async def _handle_events(
+ self, connection: QuicConnection, client: Optional[Tuple[str, int]] = None
+ ) -> None:
+ event = connection.next_event()
+ while event is not None:
+ if isinstance(event, ConnectionTerminated):
+ pass
+ elif isinstance(event, ProtocolNegotiated):
+ self.http_connections[connection] = H3Protocol(
+ self.app,
+ self.config,
+ self.context,
+ client,
+ self.server,
+ connection,
+ partial(self.send_all, connection),
+ )
+ elif isinstance(event, ConnectionIdIssued):
+ self.connections[event.connection_id] = connection
+ elif isinstance(event, ConnectionIdRetired):
+ del self.connections[event.connection_id]
+
+ if connection in self.http_connections:
+ await self.http_connections[connection].handle(event)
+
+ event = connection.next_event()
+
+ await self.send_all(connection)
+
+ timer = connection.get_timer()
+ if timer is not None:
+ self.context.spawn(self._handle_timer, timer, connection)
+
+ async def _handle_timer(self, timer: float, connection: QuicConnection) -> None:
+ wait = max(0, timer - self.context.time())
+ await self.context.sleep(wait)
+ if connection._close_at is not None:
+ connection.handle_timer(now=self.context.time())
+ await self._handle_events(connection, None)
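The QUIC handler above routes every incoming datagram by its destination connection ID, keeping one entry per CID that maps to a shared connection object. A minimal sketch of that routing table, using hypothetical stand-in classes rather than aioquic's real `QuicConnection` or hypercorn's event types:

```python
# Simplified sketch of the CID routing in handle()/_handle_events() above.
# FakeConnection and ConnectionTable are illustrative stand-ins, not APIs
# from hypercorn or aioquic.
from typing import Dict, Optional


class FakeConnection:
    """Stands in for aioquic's QuicConnection in this sketch."""

    def __init__(self, host_cid: bytes) -> None:
        self.host_cid = host_cid


class ConnectionTable:
    """Route datagrams by destination connection ID."""

    def __init__(self) -> None:
        self.connections: Dict[bytes, FakeConnection] = {}

    def register(self, initial_cid: bytes, connection: FakeConnection) -> None:
        # A new connection is reachable both via the client-chosen initial
        # CID and via the server-chosen host CID, as in handle() above.
        self.connections[initial_cid] = connection
        self.connections[connection.host_cid] = connection

    def issue(self, cid: bytes, connection: FakeConnection) -> None:
        # ConnectionIdIssued: an extra CID now maps to the same connection.
        self.connections[cid] = connection

    def retire(self, cid: bytes) -> None:
        # ConnectionIdRetired: the CID is no longer routable.
        del self.connections[cid]

    def lookup(self, destination_cid: bytes) -> Optional[FakeConnection]:
        return self.connections.get(destination_cid)
```

One QUIC connection can therefore be reached through several CIDs at once, which is why `_handle_events` adds and deletes individual map entries rather than replacing the connection.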
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/protocol/ws_stream.py b/.venv/lib/python3.9/site-packages/hypercorn/protocol/ws_stream.py
new file mode 100644
index 0000000..8f15b29
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/protocol/ws_stream.py
@@ -0,0 +1,349 @@
+from enum import auto, Enum
+from time import time
+from typing import Awaitable, Callable, List, Optional, Tuple, Union
+from urllib.parse import unquote
+
+from wsproto.connection import Connection, ConnectionState, ConnectionType
+from wsproto.events import (
+ BytesMessage,
+ CloseConnection,
+ Event as WSProtoEvent,
+ Message,
+ Ping,
+ TextMessage,
+)
+from wsproto.extensions import Extension, PerMessageDeflate
+from wsproto.frame_protocol import CloseReason
+from wsproto.handshake import server_extensions_handshake, WEBSOCKET_VERSION
+from wsproto.utilities import generate_accept_token, split_comma_header
+
+from .events import Body, Data, EndBody, EndData, Event, Request, Response, StreamClosed
+from ..config import Config
+from ..typing import (
+ ASGIFramework,
+ ASGISendEvent,
+ Context,
+ WebsocketAcceptEvent,
+ WebsocketResponseBodyEvent,
+ WebsocketResponseStartEvent,
+ WebsocketScope,
+)
+from ..utils import build_and_validate_headers, suppress_body, UnexpectedMessage, valid_server_name
+
+
+class ASGIWebsocketState(Enum):
+ # Hypercorn supports the ASGI websocket HTTP response extension,
+ # which allows HTTP responses rather than acceptance.
+ HANDSHAKE = auto()
+ CONNECTED = auto()
+ RESPONSE = auto()
+ CLOSED = auto()
+ HTTPCLOSED = auto()
+
+
+class FrameTooLarge(Exception):
+ pass
+
+
+class Handshake:
+ def __init__(self, headers: List[Tuple[bytes, bytes]], http_version: str) -> None:
+ self.http_version = http_version
+ self.connection_tokens: Optional[List[str]] = None
+ self.extensions: Optional[List[str]] = None
+ self.key: Optional[bytes] = None
+ self.subprotocols: Optional[List[str]] = None
+ self.upgrade: Optional[bytes] = None
+ self.version: Optional[bytes] = None
+ for name, value in headers:
+ name = name.lower()
+ if name == b"connection":
+ self.connection_tokens = split_comma_header(value)
+ elif name == b"sec-websocket-extensions":
+ self.extensions = split_comma_header(value)
+ elif name == b"sec-websocket-key":
+ self.key = value
+ elif name == b"sec-websocket-protocol":
+ self.subprotocols = split_comma_header(value)
+ elif name == b"sec-websocket-version":
+ self.version = value
+ elif name == b"upgrade":
+ self.upgrade = value
+
+ def is_valid(self) -> bool:
+ if self.http_version < "1.1":
+ return False
+ elif self.http_version == "1.1":
+ if self.key is None:
+ return False
+ if self.connection_tokens is None or not any(
+ token.lower() == "upgrade" for token in self.connection_tokens
+ ):
+ return False
+ if self.upgrade is None or self.upgrade.lower() != b"websocket":
+ return False
+
+ if self.version != WEBSOCKET_VERSION:
+ return False
+ return True
+
+ def accept(
+ self, subprotocol: Optional[str]
+ ) -> Tuple[int, List[Tuple[bytes, bytes]], Connection]:
+ headers = []
+ if subprotocol is not None:
+ if self.subprotocols is None or subprotocol not in self.subprotocols:
+ raise Exception("Invalid Subprotocol")
+ else:
+ headers.append((b"sec-websocket-protocol", subprotocol.encode()))
+
+ extensions: List[Extension] = [PerMessageDeflate()]
+ accepts = None
+ if self.extensions is not None:
+ accepts = server_extensions_handshake(self.extensions, extensions)
+
+ if accepts:
+ headers.append((b"sec-websocket-extensions", accepts))
+
+ if self.key is not None:
+ headers.append((b"sec-websocket-accept", generate_accept_token(self.key)))
+
+ status_code = 200
+ if self.http_version == "1.1":
+ headers.extend([(b"upgrade", b"WebSocket"), (b"connection", b"Upgrade")])
+ status_code = 101
+
+ return status_code, headers, Connection(ConnectionType.SERVER, extensions)
+
+
+class WebsocketBuffer:
+ def __init__(self, max_length: int) -> None:
+ self.value: Optional[Union[bytes, str]] = None
+ self.max_length = max_length
+
+ def extend(self, event: Message) -> None:
+ if self.value is None:
+ if isinstance(event, TextMessage):
+ self.value = ""
+ else:
+ self.value = b""
+ self.value += event.data
+ if len(self.value) > self.max_length:
+ raise FrameTooLarge()
+
+ def clear(self) -> None:
+ self.value = None
+
+ def to_message(self) -> dict:
+ return {
+ "type": "websocket.receive",
+ "bytes": self.value if isinstance(self.value, bytes) else None,
+ "text": self.value if isinstance(self.value, str) else None,
+ }
+
+
+class WSStream:
+ def __init__(
+ self,
+ app: ASGIFramework,
+ config: Config,
+ context: Context,
+ ssl: bool,
+ client: Optional[Tuple[str, int]],
+ server: Optional[Tuple[str, int]],
+ send: Callable[[Event], Awaitable[None]],
+ stream_id: int,
+ ) -> None:
+ self.app = app
+ self.app_put: Optional[Callable] = None
+ self.buffer = WebsocketBuffer(config.websocket_max_message_size)
+ self.client = client
+ self.closed = False
+ self.config = config
+ self.context = context
+ self.response: WebsocketResponseStartEvent
+ self.scope: WebsocketScope
+ self.send = send
+ # RFC 8441 for HTTP/2 says use http or https, ASGI says ws or wss
+ self.scheme = "wss" if ssl else "ws"
+ self.server = server
+ self.start_time: float
+ self.state = ASGIWebsocketState.HANDSHAKE
+ self.stream_id = stream_id
+
+ self.connection: Connection
+ self.handshake: Handshake
+
+ @property
+ def idle(self) -> bool:
+ return self.state in {ASGIWebsocketState.CLOSED, ASGIWebsocketState.HTTPCLOSED}
+
+ async def handle(self, event: Event) -> None:
+ if self.closed:
+ return
+ elif isinstance(event, Request):
+ self.start_time = time()
+ self.handshake = Handshake(event.headers, event.http_version)
+ path, _, query_string = event.raw_path.partition(b"?")
+ self.scope = {
+ "type": "websocket",
+ "asgi": {"spec_version": "2.1"},
+ "scheme": self.scheme,
+ "http_version": event.http_version,
+ "path": unquote(path.decode("ascii")),
+ "raw_path": path,
+ "query_string": query_string,
+ "root_path": self.config.root_path,
+ "headers": event.headers,
+ "client": self.client,
+ "server": self.server,
+ "subprotocols": self.handshake.subprotocols or [],
+ "extensions": {"websocket.http.response": {}},
+ }
+
+ if not valid_server_name(self.config, event):
+ await self._send_error_response(404)
+ self.closed = True
+ elif not self.handshake.is_valid():
+ await self._send_error_response(400)
+ self.closed = True
+ else:
+ self.app_put = await self.context.spawn_app(
+ self.app, self.config, self.scope, self.app_send
+ )
+ await self.app_put({"type": "websocket.connect"}) # type: ignore
+ elif isinstance(event, (Body, Data)):
+ self.connection.receive_data(event.data)
+ await self._handle_events()
+ elif isinstance(event, StreamClosed):
+ self.closed = True
+ if self.app_put is not None:
+ if self.state in {ASGIWebsocketState.HTTPCLOSED, ASGIWebsocketState.CLOSED}:
+ code = CloseReason.NORMAL_CLOSURE.value
+ else:
+ code = CloseReason.ABNORMAL_CLOSURE.value
+ await self.app_put({"type": "websocket.disconnect", "code": code})
+
+ async def app_send(self, message: Optional[ASGISendEvent]) -> None:
+ if self.closed:
+ # Allow app to finish after close
+ return
+
+ if message is None: # ASGI App has finished sending messages
+ # Cleanup if required
+ if self.state == ASGIWebsocketState.HANDSHAKE:
+ await self._send_error_response(500)
+ await self.config.log.access(
+ self.scope, {"status": 500, "headers": []}, time() - self.start_time
+ )
+ elif self.state == ASGIWebsocketState.CONNECTED:
+ await self._send_wsproto_event(CloseConnection(code=CloseReason.ABNORMAL_CLOSURE))
+ await self.send(StreamClosed(stream_id=self.stream_id))
+ else:
+ if message["type"] == "websocket.accept" and self.state == ASGIWebsocketState.HANDSHAKE:
+ await self._accept(message)
+ elif (
+ message["type"] == "websocket.http.response.start"
+ and self.state == ASGIWebsocketState.HANDSHAKE
+ ):
+ self.response = message
+ elif message["type"] == "websocket.http.response.body" and self.state in {
+ ASGIWebsocketState.HANDSHAKE,
+ ASGIWebsocketState.RESPONSE,
+ }:
+ await self._send_rejection(message)
+ elif message["type"] == "websocket.send" and self.state == ASGIWebsocketState.CONNECTED:
+ event: WSProtoEvent
+ if message.get("bytes") is not None:
+ event = BytesMessage(data=bytes(message["bytes"]))
+ elif not isinstance(message["text"], str):
+ raise TypeError(f"{message['text']} should be a str")
+ else:
+ event = TextMessage(data=message["text"])
+ await self._send_wsproto_event(event)
+ elif (
+ message["type"] == "websocket.close" and self.state == ASGIWebsocketState.HANDSHAKE
+ ):
+ self.state = ASGIWebsocketState.HTTPCLOSED
+ await self._send_error_response(403)
+ elif message["type"] == "websocket.close":
+ self.state = ASGIWebsocketState.CLOSED
+ await self._send_wsproto_event(
+ CloseConnection(code=int(message.get("code", CloseReason.NORMAL_CLOSURE)))
+ )
+ await self.send(EndData(stream_id=self.stream_id))
+ else:
+ raise UnexpectedMessage(self.state, message["type"])
+
+ async def _handle_events(self) -> None:
+ for event in self.connection.events():
+ if isinstance(event, Message):
+ try:
+ self.buffer.extend(event)
+ except FrameTooLarge:
+ await self._send_wsproto_event(
+ CloseConnection(code=CloseReason.MESSAGE_TOO_BIG)
+ )
+ break
+
+ if event.message_finished:
+ await self.app_put(self.buffer.to_message())
+ self.buffer.clear()
+ elif isinstance(event, Ping):
+ await self._send_wsproto_event(event.response())
+ elif isinstance(event, CloseConnection):
+ if self.connection.state == ConnectionState.REMOTE_CLOSING:
+ await self._send_wsproto_event(event.response())
+ await self.send(StreamClosed(stream_id=self.stream_id))
+
+ async def _send_error_response(self, status_code: int) -> None:
+ await self.send(
+ Response(
+ stream_id=self.stream_id,
+ status_code=status_code,
+ headers=[(b"content-length", b"0"), (b"connection", b"close")],
+ )
+ )
+ await self.send(EndBody(stream_id=self.stream_id))
+ await self.config.log.access(
+ self.scope, {"status": status_code, "headers": []}, time() - self.start_time
+ )
+
+ async def _send_wsproto_event(self, event: WSProtoEvent) -> None:
+ data = self.connection.send(event)
+ await self.send(Data(stream_id=self.stream_id, data=data))
+
+ async def _accept(self, message: WebsocketAcceptEvent) -> None:
+ self.state = ASGIWebsocketState.CONNECTED
+ status_code, headers, self.connection = self.handshake.accept(message.get("subprotocol"))
+ await self.send(
+ Response(stream_id=self.stream_id, status_code=status_code, headers=headers)
+ )
+ await self.config.log.access(
+ self.scope, {"status": status_code, "headers": []}, time() - self.start_time
+ )
+ if self.config.websocket_ping_interval is not None:
+ self.context.spawn(self._send_pings)
+
+ async def _send_rejection(self, message: WebsocketResponseBodyEvent) -> None:
+ body_suppressed = suppress_body("GET", self.response["status"])
+ if self.state == ASGIWebsocketState.HANDSHAKE:
+ headers = build_and_validate_headers(self.response["headers"])
+ await self.send(
+ Response(
+ stream_id=self.stream_id,
+ status_code=int(self.response["status"]),
+ headers=headers,
+ )
+ )
+ self.state = ASGIWebsocketState.RESPONSE
+ if not body_suppressed:
+ await self.send(Body(stream_id=self.stream_id, data=bytes(message.get("body", b""))))
+ if not message.get("more_body", False):
+ self.state = ASGIWebsocketState.HTTPCLOSED
+ await self.send(EndBody(stream_id=self.stream_id))
+ await self.config.log.access(self.scope, self.response, time() - self.start_time)
+
+ async def _send_pings(self) -> None:
+ while not self.closed:
+ await self._send_wsproto_event(Ping())
+ await self.context.sleep(self.config.websocket_ping_interval)
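`WSStream` supports two outcomes for a websocket handshake: normal acceptance, or an HTTP rejection via the `websocket.http.response` extension advertised in the scope. A hedged sketch of an ASGI app exercising both paths, driven here with plain in-memory callables (the app and driver are illustrative, not part of hypercorn):

```python
# Minimal ASGI websocket app: accept on most paths, reject "/forbidden"
# with a plain HTTP response via the websocket.http.response extension.
import asyncio


async def ws_app(scope, receive, send):
    assert scope["type"] == "websocket"
    await receive()  # expect {"type": "websocket.connect"}
    if scope["path"] == "/forbidden":
        # Only valid when the server advertises the extension, as the
        # scope built in WSStream.handle() above does.
        await send({"type": "websocket.http.response.start",
                    "status": 403, "headers": []})
        await send({"type": "websocket.http.response.body", "body": b"nope"})
    else:
        await send({"type": "websocket.accept"})
        await send({"type": "websocket.send", "text": "hello"})
        await send({"type": "websocket.close"})


def drive(path):
    """Run ws_app against in-memory queues and collect the sent events."""
    sent = []

    async def main():
        inbox = [{"type": "websocket.connect"}]

        async def receive():
            return inbox.pop(0)

        async def send(message):
            sent.append(message)

        scope = {"type": "websocket", "path": path,
                 "extensions": {"websocket.http.response": {}}}
        await ws_app(scope, receive, send)

    asyncio.run(main())
    return sent
```

The rejection path corresponds to the `RESPONSE`/`HTTPCLOSED` states in `ASGIWebsocketState`; the acceptance path to `CONNECTED`/`CLOSED`.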
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/py.typed b/.venv/lib/python3.9/site-packages/hypercorn/py.typed
new file mode 100644
index 0000000..f5642f7
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/py.typed
@@ -0,0 +1 @@
+Marker
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/run.py b/.venv/lib/python3.9/site-packages/hypercorn/run.py
new file mode 100644
index 0000000..aa24d69
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/run.py
@@ -0,0 +1,80 @@
+import platform
+import random
+import signal
+import time
+from multiprocessing import Event, Process
+from typing import Any
+
+from .config import Config
+from .typing import WorkerFunc
+from .utils import write_pid_file
+
+
+def run(config: Config) -> None:
+ if config.pid_path is not None:
+ write_pid_file(config.pid_path)
+
+ worker_func: WorkerFunc
+ if config.worker_class == "asyncio":
+ from .asyncio.run import asyncio_worker
+
+ worker_func = asyncio_worker
+ elif config.worker_class == "uvloop":
+ from .asyncio.run import uvloop_worker
+
+ worker_func = uvloop_worker
+ elif config.worker_class == "trio":
+ from .trio.run import trio_worker
+
+ worker_func = trio_worker
+ else:
+ raise ValueError(f"No worker of class {config.worker_class} exists")
+
+ if config.workers == 1:
+ worker_func(config)
+ else:
+ run_multiple(config, worker_func)
+
+
+def run_multiple(config: Config, worker_func: WorkerFunc) -> None:
+ if config.use_reloader:
+ raise RuntimeError("Reloader can only be used with a single worker")
+
+ sockets = config.create_sockets()
+
+ processes = []
+
+ # Ignore SIGINT before creating the processes, so that they
+ # inherit the signal handling. This means that the shutdown
+ # function controls the shutdown.
+ signal.signal(signal.SIGINT, signal.SIG_IGN)
+
+ shutdown_event = Event()
+
+ for _ in range(config.workers):
+ process = Process(
+ target=worker_func,
+ kwargs={"config": config, "shutdown_event": shutdown_event, "sockets": sockets},
+ )
+ process.daemon = True
+ process.start()
+ processes.append(process)
+ if platform.system() == "Windows":
+ time.sleep(0.1 * random.random())
+
+ def shutdown(*args: Any) -> None:
+ shutdown_event.set()
+
+ for signal_name in {"SIGINT", "SIGTERM", "SIGBREAK"}:
+ if hasattr(signal, signal_name):
+ signal.signal(getattr(signal, signal_name), shutdown)
+
+ for process in processes:
+ process.join()
+ for process in processes:
+ process.terminate()
+
+ for sock in sockets.secure_sockets:
+ sock.close()
+ for sock in sockets.insecure_sockets:
+ sock.close()
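The multi-worker shutdown in `run_multiple` above hinges on one shared `multiprocessing.Event` plus portable signal wiring. A small sketch of that pattern (the handler-installation loop is shown as a list comprehension so the sketch stays side-effect free):

```python
# Sketch of run_multiple's shutdown wiring: a shared Event flipped by a
# signal handler, with a hasattr() guard because SIGBREAK is Windows-only.
import signal
from multiprocessing import Event

shutdown_event = Event()


def shutdown(*args):
    # Same role as the closure in run_multiple: flip the shared event so
    # every worker's check_multiprocess_shutdown_event loop exits.
    shutdown_event.set()


# The signals run_multiple would actually hook on this platform.
available = [name for name in ("SIGINT", "SIGTERM", "SIGBREAK")
             if hasattr(signal, name)]
```

In the real code each worker `Process` inherits `SIG_IGN` for SIGINT, so only the parent's handler fires and shutdown is coordinated solely through the event.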
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/statsd.py b/.venv/lib/python3.9/site-packages/hypercorn/statsd.py
new file mode 100644
index 0000000..9675428
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/statsd.py
@@ -0,0 +1,93 @@
+from typing import Any, TYPE_CHECKING
+
+from .logging import Logger
+
+if TYPE_CHECKING:
+ from .config import Config
+ from .typing import ResponseSummary, WWWScope
+
+METRIC_VAR = "metric"
+VALUE_VAR = "value"
+MTYPE_VAR = "mtype"
+GAUGE_TYPE = "gauge"
+COUNTER_TYPE = "counter"
+HISTOGRAM_TYPE = "histogram"
+
+
+class StatsdLogger(Logger):
+ def __init__(self, config: "Config") -> None:
+ super().__init__(config)
+ self.dogstatsd_tags = config.dogstatsd_tags
+ self.prefix = config.statsd_prefix
+ if len(self.prefix) and self.prefix[-1] != ".":
+ self.prefix += "."
+
+ async def critical(self, message: str, *args: Any, **kwargs: Any) -> None:
+ await super().critical(message, *args, **kwargs)
+ await self.increment("hypercorn.log.critical", 1)
+
+ async def error(self, message: str, *args: Any, **kwargs: Any) -> None:
+ await super().error(message, *args, **kwargs)
+ await self.increment("hypercorn.log.error", 1)
+
+ async def warning(self, message: str, *args: Any, **kwargs: Any) -> None:
+ await super().warning(message, *args, **kwargs)
+ await self.increment("hypercorn.log.warning", 1)
+
+ async def info(self, message: str, *args: Any, **kwargs: Any) -> None:
+ await super().info(message, *args, **kwargs)
+
+ async def debug(self, message: str, *args: Any, **kwargs: Any) -> None:
+ await super().debug(message, *args, **kwargs)
+
+ async def exception(self, message: str, *args: Any, **kwargs: Any) -> None:
+ await super().exception(message, *args, **kwargs)
+ await self.increment("hypercorn.log.exception", 1)
+
+ async def log(self, level: int, message: str, *args: Any, **kwargs: Any) -> None:
+ try:
+ extra = kwargs.get("extra", None)
+ if extra is not None:
+ metric = extra.get(METRIC_VAR, None)
+ value = extra.get(VALUE_VAR, None)
+ type_ = extra.get(MTYPE_VAR, None)
+ if metric and value and type_:
+ if type_ == GAUGE_TYPE:
+ await self.gauge(metric, value)
+ elif type_ == COUNTER_TYPE:
+ await self.increment(metric, value)
+ elif type_ == HISTOGRAM_TYPE:
+ await self.histogram(metric, value)
+
+ if message:
+ await super().log(level, message, *args, **kwargs)
+ except Exception:
+ await super().warning("Failed to log to statsd", exc_info=True)
+
+ async def access(
+ self, request: "WWWScope", response: "ResponseSummary", request_time: float
+ ) -> None:
+ await super().access(request, response, request_time)
+ await self.histogram("hypercorn.request.duration", request_time * 1_000)
+ await self.increment("hypercorn.requests", 1)
+ await self.increment(f"hypercorn.request.status.{response['status']}", 1)
+
+ async def gauge(self, name: str, value: int) -> None:
+ await self._send(f"{self.prefix}{name}:{value}|g")
+
+ async def increment(self, name: str, value: int, sampling_rate: float = 1.0) -> None:
+ await self._send(f"{self.prefix}{name}:{value}|c|@{sampling_rate}")
+
+ async def decrement(self, name: str, value: int, sampling_rate: float = 1.0) -> None:
+ await self._send(f"{self.prefix}{name}:-{value}|c|@{sampling_rate}")
+
+ async def histogram(self, name: str, value: float) -> None:
+ await self._send(f"{self.prefix}{name}:{value}|ms")
+
+ async def _send(self, message: str) -> None:
+ if self.dogstatsd_tags:
+ message = f"{message}|#{self.dogstatsd_tags}"
+ await self._socket_send(message.encode("ascii"))
+
+ async def _socket_send(self, message: bytes) -> None:
+ raise NotImplementedError()
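The logger above emits plain statsd datagrams: `name:value|g` for gauges, `name:value|c|@rate` for counters, and `name:value|ms` for timings (which is how `histogram` sends request durations). A sketch of the wire format as pure functions, omitting the prefix and DogStatsd tag handling the class adds:

```python
# Statsd wire-format lines as produced by StatsdLogger._send() above
# (sketch only; the real class prepends config.statsd_prefix and appends
# "|#tags" when dogstatsd_tags is configured).
def gauge_line(name: str, value: int) -> str:
    return f"{name}:{value}|g"


def counter_line(name: str, value: int, sampling_rate: float = 1.0) -> str:
    return f"{name}:{value}|c|@{sampling_rate}"


def timer_line(name: str, value_ms: float) -> str:
    # histogram() sends request durations as statsd timers, in milliseconds.
    return f"{name}:{value_ms}|ms"
```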
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/__init__.py b/.venv/lib/python3.9/site-packages/hypercorn/trio/__init__.py
new file mode 100644
index 0000000..d51ad9e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/trio/__init__.py
@@ -0,0 +1,42 @@
+import warnings
+from typing import Awaitable, Callable, Optional
+
+import trio
+
+from .run import worker_serve
+from ..config import Config
+from ..typing import ASGIFramework
+
+
+async def serve(
+ app: ASGIFramework,
+ config: Config,
+ *,
+ shutdown_trigger: Optional[Callable[..., Awaitable[None]]] = None,
+ task_status: trio._core._run._TaskStatus = trio.TASK_STATUS_IGNORED,
+) -> None:
+ """Serve an ASGI framework app given the config.
+
+ This provides a programmatic way to serve an ASGI framework; it
+ can be used via:
+
+ .. code-block:: python
+
+ trio.run(serve, app, config)
+
+ It is assumed that the event-loop is configured before calling
+ this function, therefore configuration values that relate to loop
+ setup or process setup are ignored.
+
+ Arguments:
+ app: The ASGI application to serve.
+ config: A Hypercorn configuration object.
+ shutdown_trigger: An awaitable that should return (or raise) to
+ trigger a graceful shutdown.
+ """
+ if config.debug:
+ warnings.warn("The config `debug` has no effect when using serve", Warning)
+ if config.workers != 1:
+ warnings.warn("The config `workers` has no effect when using serve", Warning)
+
+ await worker_serve(app, config, shutdown_trigger=shutdown_trigger, task_status=task_status)
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..467b0e4
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/context.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/context.cpython-39.pyc
new file mode 100644
index 0000000..00346c5
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/context.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/lifespan.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/lifespan.cpython-39.pyc
new file mode 100644
index 0000000..a16974c
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/lifespan.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/run.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/run.cpython-39.pyc
new file mode 100644
index 0000000..35b618a
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/run.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/statsd.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/statsd.cpython-39.pyc
new file mode 100644
index 0000000..ac7c65a
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/statsd.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/tcp_server.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/tcp_server.cpython-39.pyc
new file mode 100644
index 0000000..3abbdad
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/tcp_server.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/udp_server.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/udp_server.cpython-39.pyc
new file mode 100644
index 0000000..2326bb9
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hypercorn/trio/__pycache__/udp_server.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/context.py b/.venv/lib/python3.9/site-packages/hypercorn/trio/context.py
new file mode 100644
index 0000000..d54804c
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/trio/context.py
@@ -0,0 +1,83 @@
+from typing import Any, Awaitable, Callable, Optional, Type, Union
+
+import trio
+
+from ..config import Config
+from ..typing import (
+ ASGIFramework,
+ ASGIReceiveCallable,
+ ASGIReceiveEvent,
+ ASGISendEvent,
+ Event,
+ Scope,
+)
+from ..utils import invoke_asgi
+
+
+class EventWrapper:
+ def __init__(self) -> None:
+ self._event = trio.Event()
+
+ async def clear(self) -> None:
+ self._event = trio.Event()
+
+ async def wait(self) -> None:
+ await self._event.wait()
+
+ async def set(self) -> None:
+ self._event.set()
+
+
+async def _handle(
+ app: ASGIFramework,
+ config: Config,
+ scope: Scope,
+ receive: ASGIReceiveCallable,
+ send: Callable[[Optional[ASGISendEvent]], Awaitable[None]],
+) -> None:
+ try:
+ await invoke_asgi(app, scope, receive, send)
+ except trio.Cancelled:
+ raise
+ except trio.MultiError as error:
+ errors = trio.MultiError.filter(
+ lambda exc: None if isinstance(exc, trio.Cancelled) else exc, root_exc=error
+ )
+ if errors is not None:
+ await config.log.exception("Error in ASGI Framework")
+ await send(None)
+ else:
+ raise
+ except Exception:
+ await config.log.exception("Error in ASGI Framework")
+ finally:
+ await send(None)
+
+
+class Context:
+ event_class: Type[Event] = EventWrapper
+
+ def __init__(self, nursery: trio._core._run.Nursery) -> None:
+ self.nursery = nursery
+
+ async def spawn_app(
+ self,
+ app: ASGIFramework,
+ config: Config,
+ scope: Scope,
+ send: Callable[[Optional[ASGISendEvent]], Awaitable[None]],
+ ) -> Callable[[ASGIReceiveEvent], Awaitable[None]]:
+ app_send_channel, app_receive_channel = trio.open_memory_channel(config.max_app_queue_size)
+ self.nursery.start_soon(_handle, app, config, scope, app_receive_channel.receive, send)
+ return app_send_channel.send
+
+ def spawn(self, func: Callable, *args: Any) -> None:
+ self.nursery.start_soon(func, *args)
+
+ @staticmethod
+ async def sleep(wait: Union[float, int]) -> None:
+ return await trio.sleep(wait)
+
+ @staticmethod
+ def time() -> float:
+ return trio.current_time()
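`Context.spawn_app` above hands the app task the receive end of a bounded memory channel and returns the send end as the app's inbox. The same pattern in miniature, shown with `asyncio.Queue` instead of trio memory channels purely for illustration:

```python
# The spawn_app inbox pattern: the server keeps the send side, the app
# task reads from the receive side. asyncio stand-in, not hypercorn code.
import asyncio


def make_inbox(maxsize: int = 10):
    queue: asyncio.Queue = asyncio.Queue(maxsize)
    # The app would be spawned with the getter as its `receive` callable;
    # the returned putter plays the role of spawn_app's return value.
    return queue.put, queue.get


async def demo():
    put, get = make_inbox()
    await put({"type": "websocket.connect"})
    return await get()
```

The bounded size (`config.max_app_queue_size` in the real code) applies backpressure: a slow app eventually blocks the server's `put` instead of buffering unboundedly.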
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/lifespan.py b/.venv/lib/python3.9/site-packages/hypercorn/trio/lifespan.py
new file mode 100644
index 0000000..8ed8e5a
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/trio/lifespan.py
@@ -0,0 +1,79 @@
+import trio
+
+from ..config import Config
+from ..typing import ASGIFramework, ASGIReceiveEvent, ASGISendEvent, LifespanScope
+from ..utils import invoke_asgi, LifespanFailure, LifespanTimeout
+
+
+class UnexpectedMessage(Exception):
+ pass
+
+
+class Lifespan:
+ def __init__(self, app: ASGIFramework, config: Config) -> None:
+ self.app = app
+ self.config = config
+ self.startup = trio.Event()
+ self.shutdown = trio.Event()
+ self.app_send_channel, self.app_receive_channel = trio.open_memory_channel(
+ config.max_app_queue_size
+ )
+ self.supported = True
+
+ async def handle_lifespan(
+ self, *, task_status: trio._core._run._TaskStatus = trio.TASK_STATUS_IGNORED
+ ) -> None:
+ task_status.started()
+ scope: LifespanScope = {"type": "lifespan", "asgi": {"spec_version": "2.0"}}
+ try:
+ await invoke_asgi(self.app, scope, self.asgi_receive, self.asgi_send)
+ except LifespanFailure:
+ # Lifespan failures should crash the server
+ raise
+ except Exception:
+ self.supported = False
+ await self.config.log.exception(
+ "ASGI Framework Lifespan error, continuing without Lifespan support"
+ )
+ finally:
+ self.startup.set()
+ self.shutdown.set()
+ await self.app_send_channel.aclose()
+ await self.app_receive_channel.aclose()
+
+ async def wait_for_startup(self) -> None:
+ if not self.supported:
+ return
+
+ await self.app_send_channel.send({"type": "lifespan.startup"})
+ try:
+ with trio.fail_after(self.config.startup_timeout):
+ await self.startup.wait()
+ except trio.TooSlowError as error:
+ raise LifespanTimeout("startup") from error
+
+ async def wait_for_shutdown(self) -> None:
+ if not self.supported:
+ return
+
+ await self.app_send_channel.send({"type": "lifespan.shutdown"})
+ try:
+ with trio.fail_after(self.config.shutdown_timeout):
+ await self.shutdown.wait()
+ except trio.TooSlowError as error:
+ raise LifespanTimeout("shutdown") from error
+
+ async def asgi_receive(self) -> ASGIReceiveEvent:
+ return await self.app_receive_channel.receive()
+
+ async def asgi_send(self, message: ASGISendEvent) -> None:
+ if message["type"] == "lifespan.startup.complete":
+ self.startup.set()
+ elif message["type"] == "lifespan.shutdown.complete":
+ self.shutdown.set()
+ elif message["type"] == "lifespan.startup.failed":
+ raise LifespanFailure("startup", message["message"])
+ elif message["type"] == "lifespan.shutdown.failed":
+ raise LifespanFailure("shutdown", message["message"])
+ else:
+ raise UnexpectedMessage(message["type"])
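The `Lifespan` class above drives a simple exchange: the server sends `lifespan.startup` and `lifespan.shutdown`, and waits for the app's `*.complete` (or `*.failed`) replies. A hedged sketch of a conforming app, driven here with asyncio for illustration (hypercorn's version runs under trio):

```python
# Minimal lifespan-aware ASGI app plus an in-memory driver mirroring the
# message flow of Lifespan.wait_for_startup()/wait_for_shutdown() above.
import asyncio


async def lifespan_app(scope, receive, send):
    assert scope["type"] == "lifespan"
    while True:
        message = await receive()
        if message["type"] == "lifespan.startup":
            await send({"type": "lifespan.startup.complete"})
        elif message["type"] == "lifespan.shutdown":
            await send({"type": "lifespan.shutdown.complete"})
            return


def run_lifespan():
    replies = []

    async def main():
        inbox = [{"type": "lifespan.startup"}, {"type": "lifespan.shutdown"}]

        async def receive():
            return inbox.pop(0)

        async def send(message):
            replies.append(message["type"])

        scope = {"type": "lifespan", "asgi": {"spec_version": "2.0"}}
        await lifespan_app(scope, receive, send)

    asyncio.run(main())
    return replies
```

An app that raises instead of replying is what trips the `except Exception` branch in `handle_lifespan`, after which the server continues without lifespan support.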
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/run.py b/.venv/lib/python3.9/site-packages/hypercorn/trio/run.py
new file mode 100644
index 0000000..a8a7fc1
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/trio/run.py
@@ -0,0 +1,119 @@
+from functools import partial
+from multiprocessing.synchronize import Event as EventType
+from typing import Awaitable, Callable, Optional
+
+import trio
+
+from .lifespan import Lifespan
+from .statsd import StatsdLogger
+from .tcp_server import TCPServer
+from .udp_server import UDPServer
+from ..config import Config, Sockets
+from ..typing import ASGIFramework
+from ..utils import (
+ check_multiprocess_shutdown_event,
+ load_application,
+ MustReloadException,
+ observe_changes,
+ raise_shutdown,
+ repr_socket_addr,
+ restart,
+ Shutdown,
+)
+
+
+async def worker_serve(
+ app: ASGIFramework,
+ config: Config,
+ *,
+ sockets: Optional[Sockets] = None,
+ shutdown_trigger: Optional[Callable[..., Awaitable[None]]] = None,
+ task_status: trio._core._run._TaskStatus = trio.TASK_STATUS_IGNORED,
+) -> None:
+ config.set_statsd_logger_class(StatsdLogger)
+
+ lifespan = Lifespan(app, config)
+ reload_ = False
+
+ async with trio.open_nursery() as lifespan_nursery:
+ await lifespan_nursery.start(lifespan.handle_lifespan)
+ await lifespan.wait_for_startup()
+
+ try:
+ async with trio.open_nursery() as nursery:
+ if config.use_reloader:
+ nursery.start_soon(observe_changes, trio.sleep)
+
+ if shutdown_trigger is not None:
+ nursery.start_soon(raise_shutdown, shutdown_trigger)
+
+ if sockets is None:
+ sockets = config.create_sockets()
+ for sock in sockets.secure_sockets:
+ sock.listen(config.backlog)
+ for sock in sockets.insecure_sockets:
+ sock.listen(config.backlog)
+
+ ssl_context = config.create_ssl_context()
+ listeners = []
+ binds = []
+ for sock in sockets.secure_sockets:
+ listeners.append(
+ trio.SSLListener(
+ trio.SocketListener(trio.socket.from_stdlib_socket(sock)),
+ ssl_context,
+ https_compatible=True,
+ )
+ )
+ bind = repr_socket_addr(sock.family, sock.getsockname())
+ binds.append(f"https://{bind}")
+ await config.log.info(f"Running on https://{bind} (CTRL + C to quit)")
+
+ for sock in sockets.insecure_sockets:
+ listeners.append(trio.SocketListener(trio.socket.from_stdlib_socket(sock)))
+ bind = repr_socket_addr(sock.family, sock.getsockname())
+ binds.append(f"http://{bind}")
+ await config.log.info(f"Running on http://{bind} (CTRL + C to quit)")
+
+ for sock in sockets.quic_sockets:
+ await nursery.start(UDPServer(app, config, sock, nursery).run)
+ bind = repr_socket_addr(sock.family, sock.getsockname())
+ await config.log.info(f"Running on https://{bind} (QUIC) (CTRL + C to quit)")
+
+ task_status.started(binds)
+ await trio.serve_listeners(
+ partial(TCPServer, app, config), listeners, handler_nursery=lifespan_nursery
+ )
+
+ except MustReloadException:
+ reload_ = True
+ except (Shutdown, KeyboardInterrupt):
+ pass
+ finally:
+ try:
+ await trio.sleep(config.graceful_timeout)
+ except (Shutdown, KeyboardInterrupt):
+ pass
+
+ await lifespan.wait_for_shutdown()
+ lifespan_nursery.cancel_scope.cancel()
+
+ if reload_:
+ restart()
+
+
+def trio_worker(
+ config: Config, sockets: Optional[Sockets] = None, shutdown_event: Optional[EventType] = None
+) -> None:
+ if sockets is not None:
+ for sock in sockets.secure_sockets:
+ sock.listen(config.backlog)
+ for sock in sockets.insecure_sockets:
+ sock.listen(config.backlog)
+ app = load_application(config.application_path)
+
+ shutdown_trigger = None
+ if shutdown_event is not None:
+ shutdown_trigger = partial(check_multiprocess_shutdown_event, shutdown_event, trio.sleep)
+
+ trio.run(partial(worker_serve, app, config, sockets=sockets, shutdown_trigger=shutdown_trigger))
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/statsd.py b/.venv/lib/python3.9/site-packages/hypercorn/trio/statsd.py
new file mode 100644
index 0000000..00c0901
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/trio/statsd.py
@@ -0,0 +1,14 @@
+import trio
+
+from ..config import Config
+from ..statsd import StatsdLogger as Base
+
+
+class StatsdLogger(Base):
+ def __init__(self, config: Config) -> None:
+ super().__init__(config)
+ self.address = config.statsd_host.rsplit(":", 1)
+ self.socket = trio.socket.socket(trio.socket.AF_INET, trio.socket.SOCK_DGRAM)
+
+ async def _socket_send(self, message: bytes) -> None:
+ await self.socket.sendto(message, self.address)
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/tcp_server.py b/.venv/lib/python3.9/site-packages/hypercorn/trio/tcp_server.py
new file mode 100644
index 0000000..ab652b0
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/trio/tcp_server.py
@@ -0,0 +1,152 @@
+from typing import Any, Callable, Generator, Optional
+
+import trio
+
+from .context import Context
+from ..config import Config
+from ..events import Closed, Event, RawData, Updated
+from ..protocol import ProtocolWrapper
+from ..typing import ASGIFramework
+from ..utils import parse_socket_addr
+
+MAX_RECV = 2 ** 16
+
+
+class EventWrapper:
+ def __init__(self) -> None:
+ self._event = trio.Event()
+
+ async def clear(self) -> None:
+ self._event = trio.Event()
+
+ async def wait(self) -> None:
+ await self._event.wait()
+
+ async def set(self) -> None:
+ self._event.set()
+
+
+class TCPServer:
+ def __init__(self, app: ASGIFramework, config: Config, stream: trio.abc.Stream) -> None:
+ self.app = app
+ self.config = config
+ self.protocol: ProtocolWrapper
+ self.send_lock = trio.Lock()
+ self.timeout_lock = trio.Lock()
+ self.stream = stream
+
+ self._keep_alive_timeout_handle: Optional[trio.CancelScope] = None
+
+ def __await__(self) -> Generator[Any, None, None]:
+ return self.run().__await__()
+
+ async def run(self) -> None:
+ try:
+ try:
+ with trio.fail_after(self.config.ssl_handshake_timeout):
+ await self.stream.do_handshake()
+ except (trio.BrokenResourceError, trio.TooSlowError):
+ return # Handshake failed
+ alpn_protocol = self.stream.selected_alpn_protocol()
+ socket = self.stream.transport_stream.socket
+ ssl = True
+ except AttributeError: # Not SSL
+ alpn_protocol = "http/1.1"
+ socket = self.stream.socket
+ ssl = False
+
+ try:
+ client = parse_socket_addr(socket.family, socket.getpeername())
+ server = parse_socket_addr(socket.family, socket.getsockname())
+
+ async with trio.open_nursery() as nursery:
+ self.nursery = nursery
+ context = Context(nursery)
+ self.protocol = ProtocolWrapper(
+ self.app,
+ self.config,
+ context,
+ ssl,
+ client,
+ server,
+ self.protocol_send,
+ alpn_protocol,
+ )
+ await self.protocol.initiate()
+ await self._update_keep_alive_timeout()
+ await self._read_data()
+ except (trio.MultiError, OSError):
+ pass
+ finally:
+ await self._close()
+
+ async def protocol_send(self, event: Event) -> None:
+ if isinstance(event, RawData):
+ async with self.send_lock:
+ try:
+ with trio.CancelScope() as cancel_scope:
+ cancel_scope.shield = True
+ await self.stream.send_all(event.data)
+ except (trio.BrokenResourceError, trio.ClosedResourceError):
+ await self.protocol.handle(Closed())
+ elif isinstance(event, Closed):
+ await self._close()
+ await self.protocol.handle(Closed())
+ elif isinstance(event, Updated):
+ pass # Triggers the keep alive timeout update
+ await self._update_keep_alive_timeout()
+
+ async def _read_data(self) -> None:
+ while True:
+ try:
+ data = await self.stream.receive_some(MAX_RECV)
+ except (trio.ClosedResourceError, trio.BrokenResourceError):
+ await self.protocol.handle(Closed())
+ break
+ else:
+ if data == b"":
+ await self._update_keep_alive_timeout()
+ break
+ await self.protocol.handle(RawData(data))
+ await self._update_keep_alive_timeout()
+
+ async def _close(self) -> None:
+ try:
+ await self.stream.send_eof()
+ except (
+ trio.BrokenResourceError,
+ AttributeError,
+ trio.BusyResourceError,
+ trio.ClosedResourceError,
+ ):
+ # They're already gone, nothing to do
+            # Or it is an SSL stream
+ pass
+ await self.stream.aclose()
+
+ async def _update_keep_alive_timeout(self) -> None:
+ async with self.timeout_lock:
+ if self._keep_alive_timeout_handle is not None:
+ self._keep_alive_timeout_handle.cancel()
+ self._keep_alive_timeout_handle = None
+ if self.protocol.idle:
+ self._keep_alive_timeout_handle = await self.nursery.start(
+ _call_later, self.config.keep_alive_timeout, self._timeout
+ )
+
+ async def _timeout(self) -> None:
+ await self.protocol.handle(Closed())
+ await self.stream.aclose()
+
+
+async def _call_later(
+ timeout: float,
+ callback: Callable,
+ task_status: trio._core._run._TaskStatus = trio.TASK_STATUS_IGNORED,
+) -> None:
+ cancel_scope = trio.CancelScope()
+ task_status.started(cancel_scope)
+ with cancel_scope:
+ await trio.sleep(timeout)
+ cancel_scope.shield = True
+ await callback()
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/trio/udp_server.py b/.venv/lib/python3.9/site-packages/hypercorn/trio/udp_server.py
new file mode 100644
index 0000000..6050a89
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/trio/udp_server.py
@@ -0,0 +1,40 @@
+import trio
+
+from .context import Context
+from ..config import Config
+from ..events import Event, RawData
+from ..typing import ASGIFramework
+from ..utils import parse_socket_addr
+
+MAX_RECV = 2 ** 16
+
+
+class UDPServer:
+ def __init__(
+ self,
+ app: ASGIFramework,
+ config: Config,
+ socket: trio.socket.socket,
+ nursery: trio._core._run.Nursery,
+ ) -> None:
+ from ..protocol.quic import QuicProtocol # h3/Quic is an optional part of Hypercorn
+
+ self.app = app
+ self.config = config
+ self.nursery = nursery
+ self.socket = trio.socket.from_stdlib_socket(socket)
+ context = Context(nursery)
+ server = parse_socket_addr(socket.family, socket.getsockname())
+ self.protocol = QuicProtocol(self.app, self.config, context, server, self.protocol_send)
+
+ async def run(
+ self, task_status: trio._core._run._TaskStatus = trio.TASK_STATUS_IGNORED
+ ) -> None:
+ task_status.started()
+ while True:
+ data, address = await self.socket.recvfrom(MAX_RECV)
+ await self.protocol.handle(RawData(data=data, address=address))
+
+ async def protocol_send(self, event: Event) -> None:
+ if isinstance(event, RawData):
+ await self.socket.sendto(event.data, event.address)
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/typing.py b/.venv/lib/python3.9/site-packages/hypercorn/typing.py
new file mode 100644
index 0000000..a09e57b
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/typing.py
@@ -0,0 +1,308 @@
+from multiprocessing.synchronize import Event as EventType
+from typing import Any, Awaitable, Callable, Dict, Iterable, Optional, Tuple, Type, Union
+
+import h2.events
+import h11
+
+# Literal, Protocol (PEP 544) and TypedDict are only in typing from Python 3.8
+try:
+ from typing import Literal, Protocol, TypedDict
+except ImportError:
+ from typing_extensions import Literal, Protocol, TypedDict # type: ignore
+
+from .config import Config, Sockets
+
+H11SendableEvent = Union[h11.Data, h11.EndOfMessage, h11.InformationalResponse, h11.Response]
+
+WorkerFunc = Callable[[Config, Optional[Sockets], Optional[EventType]], None]
+
+
+class ASGIVersions(TypedDict, total=False):
+ spec_version: str
+ version: Union[Literal["2.0"], Literal["3.0"]]
+
+
+class HTTPScope(TypedDict):
+ type: Literal["http"]
+ asgi: ASGIVersions
+ http_version: str
+ method: str
+ scheme: str
+ path: str
+ raw_path: bytes
+ query_string: bytes
+ root_path: str
+ headers: Iterable[Tuple[bytes, bytes]]
+ client: Optional[Tuple[str, int]]
+ server: Optional[Tuple[str, Optional[int]]]
+ extensions: Dict[str, dict]
+
+
+class WebsocketScope(TypedDict):
+ type: Literal["websocket"]
+ asgi: ASGIVersions
+ http_version: str
+ scheme: str
+ path: str
+ raw_path: bytes
+ query_string: bytes
+ root_path: str
+ headers: Iterable[Tuple[bytes, bytes]]
+ client: Optional[Tuple[str, int]]
+ server: Optional[Tuple[str, Optional[int]]]
+ subprotocols: Iterable[str]
+ extensions: Dict[str, dict]
+
+
+class LifespanScope(TypedDict):
+ type: Literal["lifespan"]
+ asgi: ASGIVersions
+
+
+WWWScope = Union[HTTPScope, WebsocketScope]
+Scope = Union[HTTPScope, WebsocketScope, LifespanScope]
+
+
+class HTTPRequestEvent(TypedDict):
+ type: Literal["http.request"]
+ body: bytes
+ more_body: bool
+
+
+class HTTPResponseStartEvent(TypedDict):
+ type: Literal["http.response.start"]
+ status: int
+ headers: Iterable[Tuple[bytes, bytes]]
+
+
+class HTTPResponseBodyEvent(TypedDict):
+ type: Literal["http.response.body"]
+ body: bytes
+ more_body: bool
+
+
+class HTTPServerPushEvent(TypedDict):
+ type: Literal["http.response.push"]
+ path: str
+ headers: Iterable[Tuple[bytes, bytes]]
+
+
+class HTTPDisconnectEvent(TypedDict):
+ type: Literal["http.disconnect"]
+
+
+class WebsocketConnectEvent(TypedDict):
+ type: Literal["websocket.connect"]
+
+
+class WebsocketAcceptEvent(TypedDict):
+ type: Literal["websocket.accept"]
+ subprotocol: Optional[str]
+ headers: Iterable[Tuple[bytes, bytes]]
+
+
+class WebsocketReceiveEvent(TypedDict):
+ type: Literal["websocket.receive"]
+ bytes: Optional[bytes]
+ text: Optional[str]
+
+
+class WebsocketSendEvent(TypedDict):
+ type: Literal["websocket.send"]
+ bytes: Optional[bytes]
+ text: Optional[str]
+
+
+class WebsocketResponseStartEvent(TypedDict):
+ type: Literal["websocket.http.response.start"]
+ status: int
+ headers: Iterable[Tuple[bytes, bytes]]
+
+
+class WebsocketResponseBodyEvent(TypedDict):
+ type: Literal["websocket.http.response.body"]
+ body: bytes
+ more_body: bool
+
+
+class WebsocketDisconnectEvent(TypedDict):
+ type: Literal["websocket.disconnect"]
+ code: int
+
+
+class WebsocketCloseEvent(TypedDict):
+ type: Literal["websocket.close"]
+ code: int
+
+
+class LifespanStartupEvent(TypedDict):
+ type: Literal["lifespan.startup"]
+
+
+class LifespanShutdownEvent(TypedDict):
+ type: Literal["lifespan.shutdown"]
+
+
+class LifespanStartupCompleteEvent(TypedDict):
+ type: Literal["lifespan.startup.complete"]
+
+
+class LifespanStartupFailedEvent(TypedDict):
+ type: Literal["lifespan.startup.failed"]
+ message: str
+
+
+class LifespanShutdownCompleteEvent(TypedDict):
+ type: Literal["lifespan.shutdown.complete"]
+
+
+class LifespanShutdownFailedEvent(TypedDict):
+ type: Literal["lifespan.shutdown.failed"]
+ message: str
+
+
+ASGIReceiveEvent = Union[
+ HTTPRequestEvent,
+ HTTPDisconnectEvent,
+ WebsocketConnectEvent,
+ WebsocketReceiveEvent,
+ WebsocketDisconnectEvent,
+ LifespanStartupEvent,
+ LifespanShutdownEvent,
+]
+
+
+ASGISendEvent = Union[
+ HTTPResponseStartEvent,
+ HTTPResponseBodyEvent,
+ HTTPServerPushEvent,
+ HTTPDisconnectEvent,
+ WebsocketAcceptEvent,
+ WebsocketSendEvent,
+ WebsocketResponseStartEvent,
+ WebsocketResponseBodyEvent,
+ WebsocketCloseEvent,
+ LifespanStartupCompleteEvent,
+ LifespanStartupFailedEvent,
+ LifespanShutdownCompleteEvent,
+ LifespanShutdownFailedEvent,
+]
+
+
+ASGIReceiveCallable = Callable[[], Awaitable[ASGIReceiveEvent]]
+ASGISendCallable = Callable[[ASGISendEvent], Awaitable[None]]
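The TypedDict definitions above describe the messages exchanged between server and application; at runtime they are ordinary dicts. As an illustration (plain dicts, not imports from this vendored tree), a minimal HTTP response is an `http.response.start` event followed by a final `http.response.body` event:

```python
# Plain-dict illustration of the event shapes defined above; TypedDicts
# impose no runtime structure, so these are ordinary dictionaries.
response_start = {
    "type": "http.response.start",
    "status": 200,
    "headers": [(b"content-type", b"text/plain")],
}
response_body = {
    "type": "http.response.body",
    "body": b"hello",
    "more_body": False,  # False marks the final body chunk
}
print([event["type"] for event in (response_start, response_body)])
# ['http.response.start', 'http.response.body']
```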
+
+
+class ASGI2Protocol(Protocol):
+    # Matches ASGI 2 applications: a class instantiated once per scope.
+
+ def __init__(self, scope: Scope) -> None:
+ ...
+
+ async def __call__(self, receive: ASGIReceiveCallable, send: ASGISendCallable) -> None:
+ ...
+
+
+ASGI2Framework = Type[ASGI2Protocol]
+ASGI3Framework = Callable[
+ [
+ Scope,
+ ASGIReceiveCallable,
+ ASGISendCallable,
+ ],
+ Awaitable[None],
+]
+ASGIFramework = Union[ASGI2Framework, ASGI3Framework]
+
+
+class H2SyncStream(Protocol):
+ scope: dict
+
+ def data_received(self, data: bytes) -> None:
+ ...
+
+ def ended(self) -> None:
+ ...
+
+ def reset(self) -> None:
+ ...
+
+ def close(self) -> None:
+ ...
+
+ async def handle_request(
+ self,
+ event: h2.events.RequestReceived,
+ scheme: str,
+ client: Tuple[str, int],
+ server: Tuple[str, int],
+ ) -> None:
+ ...
+
+
+class H2AsyncStream(Protocol):
+ scope: dict
+
+ async def data_received(self, data: bytes) -> None:
+ ...
+
+ async def ended(self) -> None:
+ ...
+
+ async def reset(self) -> None:
+ ...
+
+ async def close(self) -> None:
+ ...
+
+ async def handle_request(
+ self,
+ event: h2.events.RequestReceived,
+ scheme: str,
+ client: Tuple[str, int],
+ server: Tuple[str, int],
+ ) -> None:
+ ...
+
+
+class Event(Protocol):
+ def __init__(self) -> None:
+ ...
+
+ async def clear(self) -> None:
+ ...
+
+ async def set(self) -> None:
+ ...
+
+ async def wait(self) -> None:
+ ...
+
+
+class Context(Protocol):
+ event_class: Type[Event]
+
+ async def spawn_app(
+ self,
+ app: ASGIFramework,
+ config: Config,
+ scope: Scope,
+ send: Callable[[Optional[ASGISendEvent]], Awaitable[None]],
+ ) -> Callable[[ASGIReceiveEvent], Awaitable[None]]:
+ ...
+
+ def spawn(self, func: Callable, *args: Any) -> None:
+ ...
+
+ @staticmethod
+ async def sleep(wait: Union[float, int]) -> None:
+ ...
+
+ @staticmethod
+ def time() -> float:
+ ...
+
+
+class ResponseSummary(TypedDict):
+ status: int
+ headers: Iterable[Tuple[bytes, bytes]]
diff --git a/.venv/lib/python3.9/site-packages/hypercorn/utils.py b/.venv/lib/python3.9/site-packages/hypercorn/utils.py
new file mode 100644
index 0000000..279c2c3
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hypercorn/utils.py
@@ -0,0 +1,261 @@
+import inspect
+import os
+import platform
+import socket
+import sys
+from enum import Enum
+from importlib import import_module
+from multiprocessing.synchronize import Event as EventType
+from pathlib import Path
+from typing import (
+ Any,
+ Awaitable,
+ Callable,
+ cast,
+ Dict,
+ Iterable,
+ List,
+ Optional,
+ Tuple,
+ TYPE_CHECKING,
+)
+
+from .config import Config
+from .typing import (
+ ASGI2Framework,
+ ASGI3Framework,
+ ASGIFramework,
+ ASGIReceiveCallable,
+ ASGISendCallable,
+ Scope,
+)
+
+if TYPE_CHECKING:
+ from .protocol.events import Request
+
+
+class Shutdown(Exception):
+ pass
+
+
+class MustReloadException(Exception):
+ pass
+
+
+class NoAppException(Exception):
+ pass
+
+
+class LifespanTimeout(Exception):
+ def __init__(self, stage: str) -> None:
+ super().__init__(
+ f"Timeout whilst awaiting {stage}. Your application may not support the ASGI Lifespan "
+ f"protocol correctly, alternatively the {stage}_timeout configuration is incorrect."
+ )
+
+
+class LifespanFailure(Exception):
+ def __init__(self, stage: str, message: str) -> None:
+ super().__init__(f"Lifespan failure in {stage}. '{message}'")
+
+
+class UnexpectedMessage(Exception):
+ def __init__(self, state: Enum, message_type: str) -> None:
+ super().__init__(f"Unexpected message type, {message_type} given the state {state}")
+
+
+class FrameTooLarge(Exception):
+ pass
+
+
+def suppress_body(method: str, status_code: int) -> bool:
+ return method == "HEAD" or 100 <= status_code < 200 or status_code in {204, 304, 412}
+
+
+def build_and_validate_headers(headers: Iterable[Tuple[bytes, bytes]]) -> List[Tuple[bytes, bytes]]:
+ # Validates that the header name and value are bytes
+ validated_headers: List[Tuple[bytes, bytes]] = []
+ for name, value in headers:
+ if name[0] == b":"[0]:
+ raise ValueError("Pseudo headers are not valid")
+ validated_headers.append((bytes(name).lower().strip(), bytes(value).strip()))
+ return validated_headers
+
+
+def filter_pseudo_headers(headers: List[Tuple[bytes, bytes]]) -> List[Tuple[bytes, bytes]]:
+ filtered_headers: List[Tuple[bytes, bytes]] = [(b"host", b"")] # Placeholder
+ for name, value in headers:
+ if name == b":authority": # h2 & h3 libraries validate this is present
+ filtered_headers[0] = (b"host", value)
+ elif name[0] != b":"[0]:
+ filtered_headers.append((name, value))
+ return filtered_headers
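`filter_pseudo_headers` maps the HTTP/2 `:authority` pseudo header onto an HTTP/1-style `host` header and drops the remaining pseudo headers. A standalone sketch of the same logic (re-implemented here so the example runs without importing this vendored hypercorn):

```python
def filter_pseudo_headers(headers):
    """Sketch mirroring the helper above: drop ':'-prefixed pseudo headers,
    mapping :authority onto a leading host header."""
    filtered = [(b"host", b"")]  # placeholder, replaced if :authority is seen
    for name, value in headers:
        if name == b":authority":
            filtered[0] = (b"host", value)
        elif not name.startswith(b":"):
            filtered.append((name, value))
    return filtered

h2_headers = [
    (b":method", b"GET"),
    (b":authority", b"example.com"),
    (b"accept", b"*/*"),
]
print(filter_pseudo_headers(h2_headers))
# [(b'host', b'example.com'), (b'accept', b'*/*')]
```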
+
+
+def load_application(path: str) -> ASGIFramework:
+ try:
+ module_name, app_name = path.split(":", 1)
+ except ValueError:
+ module_name, app_name = path, "app"
+ except AttributeError:
+ raise NoAppException()
+
+ module_path = Path(module_name).resolve()
+ sys.path.insert(0, str(module_path.parent))
+ if module_path.is_file():
+ import_name = module_path.with_suffix("").name
+ else:
+ import_name = module_path.name
+ try:
+ module = import_module(import_name)
+ except ModuleNotFoundError as error:
+ if error.name == import_name:
+ raise NoAppException()
+ else:
+ raise
+
+ try:
+ return eval(app_name, vars(module))
+ except NameError:
+ raise NoAppException()
+
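`load_application` accepts application paths in the `module:attribute` form, defaulting the attribute to `app` when no colon is present (the `AttributeError` branch catches a `None` path). The splitting step in isolation, as a sketch mirroring the code above:

```python
def split_app_path(path: str):
    """Sketch of the "module:attribute" convention used by load_application."""
    try:
        module_name, app_name = path.split(":", 1)
    except ValueError:  # no colon: the attribute defaults to "app"
        module_name, app_name = path, "app"
    return module_name, app_name

print(split_app_path("run:application"))  # ('run', 'application')
print(split_app_path("mypkg.server"))     # ('mypkg.server', 'app')
```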
+
+async def observe_changes(sleep: Callable[[float], Awaitable[Any]]) -> None:
+ last_updates: Dict[Path, float] = {}
+ for module in list(sys.modules.values()):
+ filename = getattr(module, "__file__", None)
+ if filename is None:
+ continue
+ path = Path(filename)
+ try:
+            last_updates[path] = path.stat().st_mtime
+ except (FileNotFoundError, NotADirectoryError):
+ pass
+
+ while True:
+ await sleep(1)
+
+ for index, (path, last_mtime) in enumerate(last_updates.items()):
+ if index % 10 == 0:
+ # Yield to the event loop
+ await sleep(0)
+
+ try:
+ mtime = path.stat().st_mtime
+ except FileNotFoundError:
+ # File deleted
+ raise MustReloadException()
+ else:
+ if mtime > last_mtime:
+ raise MustReloadException()
+ else:
+ last_updates[path] = mtime
+
+
+def restart() -> None:
+ # Restart this process (only safe for dev/debug)
+ executable = sys.executable
+ script_path = Path(sys.argv[0]).resolve()
+ args = sys.argv[1:]
+ main_package = sys.modules["__main__"].__package__
+
+ if main_package is None:
+ # Executed by filename
+ if platform.system() == "Windows":
+ if not script_path.exists() and script_path.with_suffix(".exe").exists():
+ # quart run
+ executable = str(script_path.with_suffix(".exe"))
+ else:
+ # python run.py
+ args.append(str(script_path))
+ else:
+ if script_path.is_file() and os.access(script_path, os.X_OK):
+ # hypercorn run:app --reload
+ executable = str(script_path)
+ else:
+ # python run.py
+ args.append(str(script_path))
+ else:
+ # Executed as a module e.g. python -m run
+ module = script_path.stem
+ import_name = main_package
+ if module != "__main__":
+ import_name = f"{main_package}.{module}"
+ args[:0] = ["-m", import_name.lstrip(".")]
+
+ os.execv(executable, [executable] + args)
+
+
+async def raise_shutdown(shutdown_event: Callable[..., Awaitable[None]]) -> None:
+ await shutdown_event()
+ raise Shutdown()
+
+
+async def check_multiprocess_shutdown_event(
+ shutdown_event: EventType, sleep: Callable[[float], Awaitable[Any]]
+) -> None:
+ while True:
+ if shutdown_event.is_set():
+ return
+ await sleep(0.1)
+
+
+def write_pid_file(pid_path: str) -> None:
+ with open(pid_path, "w") as file_:
+ file_.write(f"{os.getpid()}")
+
+
+def parse_socket_addr(family: int, address: tuple) -> Optional[Tuple[str, int]]:
+ if family == socket.AF_INET:
+ return address # type: ignore
+ elif family == socket.AF_INET6:
+ return (address[0], address[1])
+ else:
+ return None
+
+
+def repr_socket_addr(family: int, address: tuple) -> str:
+ if family == socket.AF_INET:
+ return f"{address[0]}:{address[1]}"
+ elif family == socket.AF_INET6:
+ return f"[{address[0]}]:{address[1]}"
+ elif family == socket.AF_UNIX:
+ return f"unix:{address}"
+ else:
+ return f"{address}"
+
+
+async def invoke_asgi(
+ app: ASGIFramework, scope: Scope, receive: ASGIReceiveCallable, send: ASGISendCallable
+) -> None:
+ if _is_asgi_2(app):
+ scope["asgi"]["version"] = "2.0"
+ app = cast(ASGI2Framework, app)
+ asgi_instance = app(scope)
+ await asgi_instance(receive, send)
+ else:
+ scope["asgi"]["version"] = "3.0"
+ app = cast(ASGI3Framework, app)
+ await app(scope, receive, send)
+
+
+def _is_asgi_2(app: ASGIFramework) -> bool:
+ if inspect.isclass(app):
+ return True
+
+ if hasattr(app, "__call__") and inspect.iscoroutinefunction(app.__call__): # type: ignore
+ return False
+
+ return not inspect.iscoroutinefunction(app)
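`_is_asgi_2` distinguishes the two ASGI calling conventions: an ASGI 2 application is a class instantiated once per scope, while an ASGI 3 application is a single awaitable callable. A standalone sketch of the same checks (re-implemented rather than imported, since this sits inside a vendored tree):

```python
import inspect

def is_asgi_2(app) -> bool:
    """Sketch of the detection logic above."""
    if inspect.isclass(app):
        return True  # ASGI 2 apps are classes
    if hasattr(app, "__call__") and inspect.iscoroutinefunction(app.__call__):
        return False  # object with an async __call__ is ASGI 3
    return not inspect.iscoroutinefunction(app)

class ASGI2App:                  # ASGI 2: instantiated per scope
    def __init__(self, scope): ...
    async def __call__(self, receive, send): ...

async def asgi3_app(scope, receive, send):  # ASGI 3: one coroutine callable
    ...

print(is_asgi_2(ASGI2App))   # True
print(is_asgi_2(asgi3_app))  # False
```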
+
+
+def valid_server_name(config: Config, request: "Request") -> bool:
+ if len(config.server_names) == 0:
+ return True
+
+ host = ""
+ for name, value in request.headers:
+ if name.lower() == b"host":
+ host = value.decode()
+ break
+ return host in config.server_names
diff --git a/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/INSTALLER b/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/INSTALLER
new file mode 100644
index 0000000..a1b589e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/INSTALLER
@@ -0,0 +1 @@
+pip
diff --git a/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/LICENSE b/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/LICENSE
new file mode 100644
index 0000000..d24c351
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/LICENSE
@@ -0,0 +1,21 @@
+The MIT License (MIT)
+
+Copyright (c) 2014 Cory Benfield
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+THE SOFTWARE.
diff --git a/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/METADATA b/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/METADATA
new file mode 100644
index 0000000..d919b90
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/METADATA
@@ -0,0 +1,73 @@
+Metadata-Version: 2.1
+Name: hyperframe
+Version: 6.0.0
+Summary: HTTP/2 framing layer for Python
+Home-page: https://github.com/python-hyper/hyperframe/
+Author: Cory Benfield
+Author-email: cory@lukasa.co.uk
+License: MIT License
+Platform: UNKNOWN
+Classifier: Development Status :: 5 - Production/Stable
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Programming Language :: Python
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.6
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: Implementation :: CPython
+Classifier: Programming Language :: Python :: Implementation :: PyPy
+Requires-Python: >=3.6.1
+Description-Content-Type: text/x-rst
+
+======================================
+hyperframe: Pure-Python HTTP/2 framing
+======================================
+
+.. image:: https://github.com/python-hyper/hyperframe/workflows/CI/badge.svg
+ :target: https://github.com/python-hyper/hyperframe/actions
+ :alt: Build Status
+.. image:: https://codecov.io/gh/python-hyper/hyperframe/branch/master/graph/badge.svg
+ :target: https://codecov.io/gh/python-hyper/hyperframe
+ :alt: Code Coverage
+.. image:: https://readthedocs.org/projects/hyperframe/badge/?version=latest
+ :target: https://hyperframe.readthedocs.io/en/latest/
+ :alt: Documentation Status
+.. image:: https://img.shields.io/badge/chat-join_now-brightgreen.svg
+ :target: https://gitter.im/python-hyper/community
+ :alt: Chat community
+
+This library contains the HTTP/2 framing code used in the `hyper`_ project. It
+provides a pure-Python codebase that is capable of decoding a binary stream
+into HTTP/2 frames.
+
+This library is used directly by `hyper`_ and a number of other projects to
+provide HTTP/2 frame decoding logic.
+
+Contributing
+============
+
+hyperframe welcomes contributions from anyone! Unlike many other projects, we
+are happy to accept cosmetic and small contributions in addition to large
+feature requests and changes.
+
+Before you contribute (either by opening an issue or filing a pull request),
+please `read the contribution guidelines`_.
+
+.. _read the contribution guidelines: http://hyper.readthedocs.org/en/development/contributing.html
+
+License
+=======
+
+hyperframe is made available under the MIT License. For more details, see the
+``LICENSE`` file in the repository.
+
+Authors
+=======
+
+hyperframe is maintained by Cory Benfield, with contributions from others. For
+more details about the contributors, please see ``CONTRIBUTORS.rst``.
+
+.. _hyper: http://python-hyper.org/
+
+
diff --git a/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/RECORD b/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/RECORD
new file mode 100644
index 0000000..fceea3b
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/RECORD
@@ -0,0 +1,14 @@
+hyperframe-6.0.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+hyperframe-6.0.0.dist-info/LICENSE,sha256=djqTQqBN9iBGydx0ilKHk06wpTMcaGOzygruIOGMtO0,1080
+hyperframe-6.0.0.dist-info/METADATA,sha256=pp17DSijyMmTvxYVfhKzplKyZcv_uyX1SfvZq5kN0Dc,2681
+hyperframe-6.0.0.dist-info/RECORD,,
+hyperframe-6.0.0.dist-info/WHEEL,sha256=g4nMs7d-Xl9-xC9XovUrsDHGXt-FT0E17Yqo92DEfvY,92
+hyperframe-6.0.0.dist-info/top_level.txt,sha256=aIXWNxzKF_jwE8lyWG5Paqn5RP7PDBYeguraia-oHJE,11
+hyperframe/__init__.py,sha256=sUjhICYzd6Cuvm_Io6AbRRwKpv7jTsB8XPEw54EtKjo,136
+hyperframe/__pycache__/__init__.cpython-39.pyc,,
+hyperframe/__pycache__/exceptions.cpython-39.pyc,,
+hyperframe/__pycache__/flags.cpython-39.pyc,,
+hyperframe/__pycache__/frame.cpython-39.pyc,,
+hyperframe/exceptions.py,sha256=9RlW8j73JBHkK6fX9vkfFHehDqd_xF5JWDyLWgh7bHM,1582
+hyperframe/flags.py,sha256=mW74MrMEiyEt6p_Z9tw19MPf9c9Bnih6YuKJiJBHu-4,1238
+hyperframe/frame.py,sha256=FFuNpt_J8ps-Ig1NDCV3Dzd-_9UC7LYcX1oe9edr9uA,30585
diff --git a/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/WHEEL b/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/WHEEL
new file mode 100644
index 0000000..b552003
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/WHEEL
@@ -0,0 +1,5 @@
+Wheel-Version: 1.0
+Generator: bdist_wheel (0.34.2)
+Root-Is-Purelib: true
+Tag: py3-none-any
+
diff --git a/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/top_level.txt b/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/top_level.txt
new file mode 100644
index 0000000..b21bb7c
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hyperframe-6.0.0.dist-info/top_level.txt
@@ -0,0 +1 @@
+hyperframe
diff --git a/.venv/lib/python3.9/site-packages/hyperframe/__init__.py b/.venv/lib/python3.9/site-packages/hyperframe/__init__.py
new file mode 100644
index 0000000..885a77e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hyperframe/__init__.py
@@ -0,0 +1,8 @@
+# -*- coding: utf-8 -*-
+"""
+hyperframe
+~~~~~~~~~~
+
+A module for providing a pure-Python HTTP/2 framing layer.
+"""
+__version__ = '6.0.0'
diff --git a/.venv/lib/python3.9/site-packages/hyperframe/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hyperframe/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..198a574
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hyperframe/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hyperframe/__pycache__/exceptions.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hyperframe/__pycache__/exceptions.cpython-39.pyc
new file mode 100644
index 0000000..c662a45
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hyperframe/__pycache__/exceptions.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hyperframe/__pycache__/flags.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hyperframe/__pycache__/flags.cpython-39.pyc
new file mode 100644
index 0000000..bbdb5f8
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hyperframe/__pycache__/flags.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hyperframe/__pycache__/frame.cpython-39.pyc b/.venv/lib/python3.9/site-packages/hyperframe/__pycache__/frame.cpython-39.pyc
new file mode 100644
index 0000000..dd63ac8
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/hyperframe/__pycache__/frame.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/hyperframe/exceptions.py b/.venv/lib/python3.9/site-packages/hyperframe/exceptions.py
new file mode 100644
index 0000000..3d41468
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hyperframe/exceptions.py
@@ -0,0 +1,67 @@
+# -*- coding: utf-8 -*-
+"""
+hyperframe/exceptions
+~~~~~~~~~~~~~~~~~~~~~
+
+Defines the exceptions that can be thrown by hyperframe.
+"""
+
+
+class HyperframeError(Exception):
+ """
+ The base class for all exceptions for the hyperframe module.
+
+ .. versionadded:: 6.0.0
+ """
+
+
+class UnknownFrameError(HyperframeError):
+ """
+ A frame of unknown type was received.
+
+ .. versionchanged:: 6.0.0
+ Changed base class from `ValueError` to :class:`HyperframeError`
+ """
+ def __init__(self, frame_type, length):
+ #: The type byte of the unknown frame that was received.
+ self.frame_type = frame_type
+
+ #: The length of the data portion of the unknown frame.
+ self.length = length
+
+ def __str__(self):
+ return (
+ "UnknownFrameError: Unknown frame type 0x%X received, "
+ "length %d bytes" % (self.frame_type, self.length)
+ )
+
+
+class InvalidPaddingError(HyperframeError):
+ """
+ A frame with invalid padding was received.
+
+ .. versionchanged:: 6.0.0
+ Changed base class from `ValueError` to :class:`HyperframeError`
+ """
+ pass
+
+
+class InvalidFrameError(HyperframeError):
+ """
+ Parsing a frame failed because the data was not laid out appropriately.
+
+ .. versionadded:: 3.0.2
+
+ .. versionchanged:: 6.0.0
+ Changed base class from `ValueError` to :class:`HyperframeError`
+ """
+ pass
+
+
+class InvalidDataError(HyperframeError):
+ """
+    Content or data of a frame is invalid or violates the specification.
+
+ .. versionadded:: 6.0.0
+ """
+ pass
diff --git a/.venv/lib/python3.9/site-packages/hyperframe/flags.py b/.venv/lib/python3.9/site-packages/hyperframe/flags.py
new file mode 100644
index 0000000..f3933f8
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hyperframe/flags.py
@@ -0,0 +1,48 @@
+# -*- coding: utf-8 -*-
+"""
+hyperframe/flags
+~~~~~~~~~~~~~~~~
+
+Defines basic Flag and Flags data structures.
+"""
+import collections
+from collections.abc import MutableSet
+
+Flag = collections.namedtuple("Flag", ["name", "bit"])
+
+
+class Flags(MutableSet):
+ """
+ A simple MutableSet implementation that will only accept known flags as
+ elements.
+
+ Will behave like a regular set(), except that a ValueError will be thrown
+ when .add()ing unexpected flags.
+ """
+ def __init__(self, defined_flags):
+ self._valid_flags = set(flag.name for flag in defined_flags)
+ self._flags = set()
+
+ def __repr__(self):
+ return repr(sorted(list(self._flags)))
+
+ def __contains__(self, x):
+ return self._flags.__contains__(x)
+
+ def __iter__(self):
+ return self._flags.__iter__()
+
+ def __len__(self):
+ return self._flags.__len__()
+
+ def discard(self, value):
+ return self._flags.discard(value)
+
+ def add(self, value):
+ if value not in self._valid_flags:
+ raise ValueError(
+ "Unexpected flag: {}. Valid flags are: {}".format(
+ value, self._valid_flags
+ )
+ )
+ return self._flags.add(value)
diff --git a/.venv/lib/python3.9/site-packages/hyperframe/frame.py b/.venv/lib/python3.9/site-packages/hyperframe/frame.py
new file mode 100644
index 0000000..d9b3ecf
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/hyperframe/frame.py
@@ -0,0 +1,968 @@
+# -*- coding: utf-8 -*-
+"""
+hyperframe/frame
+~~~~~~~~~~~~~~~~
+
+Defines framing logic for HTTP/2. Provides both classes to represent framed
+data and logic for aiding the connection when it comes to reading from the
+socket.
+"""
+import struct
+import binascii
+
+from .exceptions import (
+ UnknownFrameError, InvalidPaddingError, InvalidFrameError, InvalidDataError
+)
+from .flags import Flag, Flags
+
+
+# The maximum initial length of a frame. Some frames have shorter maximum
+# lengths.
+FRAME_MAX_LEN = (2 ** 14)
+
+# The maximum allowed length of a frame.
+FRAME_MAX_ALLOWED_LEN = (2 ** 24) - 1
+
+# Stream association enumerations.
+_STREAM_ASSOC_HAS_STREAM = "has-stream"
+_STREAM_ASSOC_NO_STREAM = "no-stream"
+_STREAM_ASSOC_EITHER = "either"
+
+# Structs for packing and unpacking
+_STRUCT_HBBBL = struct.Struct(">HBBBL")
+_STRUCT_LL = struct.Struct(">LL")
+_STRUCT_HL = struct.Struct(">HL")
+_STRUCT_LB = struct.Struct(">LB")
+_STRUCT_L = struct.Struct(">L")
+_STRUCT_H = struct.Struct(">H")
+_STRUCT_B = struct.Struct(">B")
+
+
+class Frame:
+ """
+ The base class for all HTTP/2 frames.
+ """
+ #: The flags defined on this type of frame.
+ defined_flags = []
+
+ #: The byte used to define the type of the frame.
+ type = None
+
+ # If 'has-stream', the frame's stream_id must be non-zero. If 'no-stream',
+ # it must be zero. If 'either', it's not checked.
+ stream_association = None
+
+ def __init__(self, stream_id, flags=()):
+ #: The stream identifier for the stream this frame was received on.
+ #: Set to 0 for frames sent on the connection (stream-id 0).
+ self.stream_id = stream_id
+
+ #: The flags set for this frame.
+ self.flags = Flags(self.defined_flags)
+
+ #: The frame length, excluding the nine-byte header.
+ self.body_len = 0
+
+ for flag in flags:
+ self.flags.add(flag)
+
+ if (not self.stream_id and
+ self.stream_association == _STREAM_ASSOC_HAS_STREAM):
+ raise InvalidDataError(
+ 'Stream ID must be non-zero for {}'.format(
+ type(self).__name__,
+ )
+ )
+ if (self.stream_id and
+ self.stream_association == _STREAM_ASSOC_NO_STREAM):
+ raise InvalidDataError(
+ 'Stream ID must be zero for {} with stream_id={}'.format(
+ type(self).__name__,
+ self.stream_id,
+ )
+ )
+
+ def __repr__(self):
+ return (
+ "{}(stream_id={}, flags={}): {}"
+ ).format(
+ type(self).__name__,
+ self.stream_id,
+ repr(self.flags),
+ self._body_repr(),
+ )
+
+ def _body_repr(self):
+ # More specific implementation may be provided by subclasses of Frame.
+ # This fallback shows the serialized (and truncated) body content.
+ return _raw_data_repr(self.serialize_body())
+
+ @staticmethod
+ def explain(data):
+ """
+ Takes a bytestring and tries to parse a single frame and print it.
+
+ This function is only provided for debugging purposes.
+
+ :param data: A memoryview object containing the raw data of at least
+ one complete frame (header and body).
+
+ .. versionadded:: 6.0.0
+ """
+ frame, length = Frame.parse_frame_header(data[:9])
+ frame.parse_body(data[9:9 + length])
+ print(frame)
+ return frame, length
+
+ @staticmethod
+ def parse_frame_header(header, strict=False):
+ """
+ Takes a 9-byte frame header and returns a tuple of the appropriate
+ Frame object and the length that needs to be read from the socket.
+
+ This populates the flags field, and determines how long the body is.
+
+ :param header: A memoryview object containing the 9-byte frame header
+ data of a frame. Must not contain more or less.
+
+ :param strict: Whether to raise an exception when encountering a frame
+ not defined by spec and implemented by hyperframe.
+
+ :raises hyperframe.exceptions.UnknownFrameError: If a frame of unknown
+ type is received.
+
+ .. versionchanged:: 5.0.0
+ Added :param:`strict` to accommodate :class:`ExtensionFrame`
+ """
+ try:
+ fields = _STRUCT_HBBBL.unpack(header)
+ except struct.error:
+ raise InvalidFrameError("Invalid frame header")
+
+ # First 24 bits are frame length.
+ length = (fields[0] << 8) + fields[1]
+ type = fields[2]
+ flags = fields[3]
+ stream_id = fields[4] & 0x7FFFFFFF
+
+ try:
+ frame = FRAMES[type](stream_id)
+ except KeyError:
+ if strict:
+ raise UnknownFrameError(type, length)
+ frame = ExtensionFrame(type=type, stream_id=stream_id)
+
+ frame.parse_flags(flags)
+ return (frame, length)
+
+ def parse_flags(self, flag_byte):
+ for flag, flag_bit in self.defined_flags:
+ if flag_byte & flag_bit:
+ self.flags.add(flag)
+
+ return self.flags
+
+ def serialize(self):
+ """
+ Convert a frame into a bytestring, representing the serialized form of
+ the frame.
+ """
+ body = self.serialize_body()
+ self.body_len = len(body)
+
+ # Build the common frame header.
+ # First, get the flags.
+ flags = 0
+
+ for flag, flag_bit in self.defined_flags:
+ if flag in self.flags:
+ flags |= flag_bit
+
+ header = _STRUCT_HBBBL.pack(
+ (self.body_len >> 8) & 0xFFFF, # Length spread over top 24 bits
+ self.body_len & 0xFF,
+ self.type,
+ flags,
+ self.stream_id & 0x7FFFFFFF # Stream ID is 32 bits.
+ )
+
+ return header + body
+
+ def serialize_body(self):
+ raise NotImplementedError()
+
+ def parse_body(self, data):
+ """
+ Given the body of a frame, parses it into frame data. This populates
+ the non-header parts of the frame: that is, it does not populate the
+ stream ID or flags.
+
+ :param data: A memoryview object containing the body data of the frame.
+ Must not contain *more* data than the length returned by
+ :meth:`parse_frame_header <hyperframe.frame.Frame.parse_frame_header>`.
+ """
+ raise NotImplementedError()
+
+
+class Padding:
+ """
+ Mixin for frames that contain padding. Defines extra fields that can be
+ used and set by frames that can be padded.
+ """
+ def __init__(self, stream_id, pad_length=0, **kwargs):
+ super().__init__(stream_id, **kwargs)
+
+ #: The length of the padding to use.
+ self.pad_length = pad_length
+
+ def serialize_padding_data(self):
+ if 'PADDED' in self.flags:
+ return _STRUCT_B.pack(self.pad_length)
+ return b''
+
+ def parse_padding_data(self, data):
+ if 'PADDED' in self.flags:
+ try:
+ self.pad_length = struct.unpack('!B', data[:1])[0]
+ except struct.error:
+ raise InvalidFrameError("Invalid Padding data")
+ return 1
+ return 0
+
+ #: .. deprecated:: 5.2.1
+ #: Use self.pad_length instead.
+ @property
+ def total_padding(self): # pragma: no cover
+ import warnings
+ warnings.warn(
+ "total_padding contains the same information as pad_length.",
+ DeprecationWarning
+ )
+ return self.pad_length
+
+
+class Priority:
+ """
+ Mixin for frames that contain priority data. Defines extra fields that can
+ be used and set by frames that contain priority data.
+ """
+ def __init__(self,
+ stream_id,
+ depends_on=0x0,
+ stream_weight=0x0,
+ exclusive=False,
+ **kwargs):
+ super().__init__(stream_id, **kwargs)
+
+ #: The stream ID of the stream on which this stream depends.
+ self.depends_on = depends_on
+
+ #: The weight of the stream. This is an integer between 0 and 256.
+ self.stream_weight = stream_weight
+
+ #: Whether the exclusive bit was set.
+ self.exclusive = exclusive
+
+ def serialize_priority_data(self):
+ return _STRUCT_LB.pack(
+ self.depends_on + (0x80000000 if self.exclusive else 0),
+ self.stream_weight
+ )
+
+ def parse_priority_data(self, data):
+ try:
+ self.depends_on, self.stream_weight = _STRUCT_LB.unpack(data[:5])
+ except struct.error:
+ raise InvalidFrameError("Invalid Priority data")
+
+ self.exclusive = True if self.depends_on >> 31 else False
+ self.depends_on &= 0x7FFFFFFF
+ return 5
+
+
+class DataFrame(Padding, Frame):
+ """
+ DATA frames convey arbitrary, variable-length sequences of octets
+ associated with a stream. One or more DATA frames are used, for instance,
+ to carry HTTP request or response payloads.
+ """
+ #: The flags defined for DATA frames.
+ defined_flags = [
+ Flag('END_STREAM', 0x01),
+ Flag('PADDED', 0x08),
+ ]
+
+ #: The type byte for data frames.
+ type = 0x0
+
+ stream_association = _STREAM_ASSOC_HAS_STREAM
+
+ def __init__(self, stream_id, data=b'', **kwargs):
+ super().__init__(stream_id, **kwargs)
+
+ #: The data contained on this frame.
+ self.data = data
+
+ def serialize_body(self):
+ padding_data = self.serialize_padding_data()
+ padding = b'\0' * self.pad_length
+ if isinstance(self.data, memoryview):
+ self.data = self.data.tobytes()
+ return b''.join([padding_data, self.data, padding])
+
+ def parse_body(self, data):
+ padding_data_length = self.parse_padding_data(data)
+ self.data = (
+ data[padding_data_length:len(data)-self.pad_length].tobytes()
+ )
+ self.body_len = len(data)
+
+ if self.pad_length and self.pad_length >= self.body_len:
+ raise InvalidPaddingError("Padding is too long.")
+
+ @property
+ def flow_controlled_length(self):
+ """
+ The length of the frame that needs to be accounted for when considering
+ flow control.
+ """
+ padding_len = 0
+ if 'PADDED' in self.flags:
+ # Account for the extra 1-byte padding length field, which is
+ # present even when the pad length itself is zero.
+ padding_len = self.pad_length + 1
+ return len(self.data) + padding_len
+
+
+class PriorityFrame(Priority, Frame):
+ """
+ The PRIORITY frame specifies the sender-advised priority of a stream. It
+ can be sent at any time for an existing stream. This enables
+ reprioritisation of existing streams.
+ """
+ #: The flags defined for PRIORITY frames.
+ defined_flags = []
+
+ #: The type byte defined for PRIORITY frames.
+ type = 0x02
+
+ stream_association = _STREAM_ASSOC_HAS_STREAM
+
+ def _body_repr(self):
+ return "exclusive={}, depends_on={}, stream_weight={}".format(
+ self.exclusive,
+ self.depends_on,
+ self.stream_weight
+ )
+
+ def serialize_body(self):
+ return self.serialize_priority_data()
+
+ def parse_body(self, data):
+ if len(data) > 5:
+ raise InvalidFrameError(
+ "PRIORITY must have 5 byte body: actual length %s." %
+ len(data)
+ )
+
+ self.parse_priority_data(data)
+ self.body_len = 5
+
+
+class RstStreamFrame(Frame):
+ """
+ The RST_STREAM frame allows for abnormal termination of a stream. When sent
+ by the initiator of a stream, it indicates that they wish to cancel the
+ stream or that an error condition has occurred. When sent by the receiver
+ of a stream, it indicates that either the receiver is rejecting the stream,
+ requesting that the stream be cancelled or that an error condition has
+ occurred.
+ """
+ #: The flags defined for RST_STREAM frames.
+ defined_flags = []
+
+ #: The type byte defined for RST_STREAM frames.
+ type = 0x03
+
+ stream_association = _STREAM_ASSOC_HAS_STREAM
+
+ def __init__(self, stream_id, error_code=0, **kwargs):
+ super().__init__(stream_id, **kwargs)
+
+ #: The error code used when resetting the stream.
+ self.error_code = error_code
+
+ def _body_repr(self):
+ return "error_code={}".format(
+ self.error_code,
+ )
+
+ def serialize_body(self):
+ return _STRUCT_L.pack(self.error_code)
+
+ def parse_body(self, data):
+ if len(data) != 4:
+ raise InvalidFrameError(
+ "RST_STREAM must have 4 byte body: actual length %s." %
+ len(data)
+ )
+
+ try:
+ self.error_code = _STRUCT_L.unpack(data)[0]
+ except struct.error: # pragma: no cover
+ raise InvalidFrameError("Invalid RST_STREAM body")
+
+ self.body_len = 4
+
+
+class SettingsFrame(Frame):
+ """
+ The SETTINGS frame conveys configuration parameters that affect how
+ endpoints communicate. The parameters are either constraints on peer
+ behavior or preferences.
+
+ Settings are not negotiated. Settings describe characteristics of the
+ sending peer, which are used by the receiving peer. Different values for
+ the same setting can be advertised by each peer. For example, a client
+ might set a high initial flow control window, whereas a server might set a
+ lower value to conserve resources.
+ """
+ #: The flags defined for SETTINGS frames.
+ defined_flags = [Flag('ACK', 0x01)]
+
+ #: The type byte defined for SETTINGS frames.
+ type = 0x04
+
+ stream_association = _STREAM_ASSOC_NO_STREAM
+
+ # We need to define the known settings; they may as well be class
+ # attributes.
+ #: The byte that signals the SETTINGS_HEADER_TABLE_SIZE setting.
+ HEADER_TABLE_SIZE = 0x01
+ #: The byte that signals the SETTINGS_ENABLE_PUSH setting.
+ ENABLE_PUSH = 0x02
+ #: The byte that signals the SETTINGS_MAX_CONCURRENT_STREAMS setting.
+ MAX_CONCURRENT_STREAMS = 0x03
+ #: The byte that signals the SETTINGS_INITIAL_WINDOW_SIZE setting.
+ INITIAL_WINDOW_SIZE = 0x04
+ #: The byte that signals the SETTINGS_MAX_FRAME_SIZE setting.
+ MAX_FRAME_SIZE = 0x05
+ #: The byte that signals the SETTINGS_MAX_HEADER_LIST_SIZE setting.
+ MAX_HEADER_LIST_SIZE = 0x06
+ #: The byte that signals SETTINGS_ENABLE_CONNECT_PROTOCOL setting.
+ ENABLE_CONNECT_PROTOCOL = 0x08
+
+ def __init__(self, stream_id=0, settings=None, **kwargs):
+ super().__init__(stream_id, **kwargs)
+
+ if settings and "ACK" in kwargs.get("flags", ()):
+ raise InvalidDataError(
+ "Settings must be empty if ACK flag is set."
+ )
+
+ #: A dictionary of the setting type byte to the value of the setting.
+ self.settings = settings or {}
+
+ def _body_repr(self):
+ return "settings={}".format(
+ self.settings,
+ )
+
+ def serialize_body(self):
+ return b''.join([_STRUCT_HL.pack(setting & 0xFF, value)
+ for setting, value in self.settings.items()])
+
+ def parse_body(self, data):
+ if 'ACK' in self.flags and len(data) > 0:
+ raise InvalidDataError(
+ "SETTINGS ack frame must not have payload: got %s bytes" %
+ len(data)
+ )
+
+ body_len = 0
+ for i in range(0, len(data), 6):
+ try:
+ name, value = _STRUCT_HL.unpack(data[i:i+6])
+ except struct.error:
+ raise InvalidFrameError("Invalid SETTINGS body")
+
+ self.settings[name] = value
+ body_len += 6
+
+ self.body_len = body_len
+
+
+class PushPromiseFrame(Padding, Frame):
+ """
+ The PUSH_PROMISE frame is used to notify the peer endpoint in advance of
+ streams the sender intends to initiate.
+ """
+ #: The flags defined for PUSH_PROMISE frames.
+ defined_flags = [
+ Flag('END_HEADERS', 0x04),
+ Flag('PADDED', 0x08)
+ ]
+
+ #: The type byte defined for PUSH_PROMISE frames.
+ type = 0x05
+
+ stream_association = _STREAM_ASSOC_HAS_STREAM
+
+ def __init__(self, stream_id, promised_stream_id=0, data=b'', **kwargs):
+ super().__init__(stream_id, **kwargs)
+
+ #: The stream ID that is promised by this frame.
+ self.promised_stream_id = promised_stream_id
+
+ #: The HPACK-encoded header block for the simulated request on the new
+ #: stream.
+ self.data = data
+
+ def _body_repr(self):
+ return "promised_stream_id={}, data={}".format(
+ self.promised_stream_id,
+ _raw_data_repr(self.data),
+ )
+
+ def serialize_body(self):
+ padding_data = self.serialize_padding_data()
+ padding = b'\0' * self.pad_length
+ data = _STRUCT_L.pack(self.promised_stream_id)
+ return b''.join([padding_data, data, self.data, padding])
+
+ def parse_body(self, data):
+ padding_data_length = self.parse_padding_data(data)
+
+ try:
+ self.promised_stream_id = _STRUCT_L.unpack(
+ data[padding_data_length:padding_data_length + 4]
+ )[0]
+ except struct.error:
+ raise InvalidFrameError("Invalid PUSH_PROMISE body")
+
+ self.data = (
+ data[padding_data_length + 4:len(data)-self.pad_length].tobytes()
+ )
+ self.body_len = len(data)
+
+ if self.promised_stream_id == 0 or self.promised_stream_id % 2 != 0:
+ raise InvalidDataError(
+ "Invalid PUSH_PROMISE promised stream id: %s" %
+ self.promised_stream_id
+ )
+
+ if self.pad_length and self.pad_length >= self.body_len:
+ raise InvalidPaddingError("Padding is too long.")
+
+
+class PingFrame(Frame):
+ """
+ The PING frame is a mechanism for measuring a minimal round-trip time from
+ the sender, as well as determining whether an idle connection is still
+ functional. PING frames can be sent from any endpoint.
+ """
+ #: The flags defined for PING frames.
+ defined_flags = [Flag('ACK', 0x01)]
+
+ #: The type byte defined for PING frames.
+ type = 0x06
+
+ stream_association = _STREAM_ASSOC_NO_STREAM
+
+ def __init__(self, stream_id=0, opaque_data=b'', **kwargs):
+ super().__init__(stream_id, **kwargs)
+
+ #: The opaque data sent in this PING frame, as a bytestring.
+ self.opaque_data = opaque_data
+
+ def _body_repr(self):
+ return "opaque_data={}".format(
+ self.opaque_data,
+ )
+
+ def serialize_body(self):
+ if len(self.opaque_data) > 8:
+ raise InvalidFrameError(
+ "PING frame may not have more than 8 bytes of data, got %s" %
+ self.opaque_data
+ )
+
+ data = self.opaque_data
+ data += b'\x00' * (8 - len(self.opaque_data))
+ return data
+
+ def parse_body(self, data):
+ if len(data) != 8:
+ raise InvalidFrameError(
+ "PING frame must have 8 byte length: got %s" % len(data)
+ )
+
+ self.opaque_data = data.tobytes()
+ self.body_len = 8
+
+
+class GoAwayFrame(Frame):
+ """
+ The GOAWAY frame informs the remote peer to stop creating streams on this
+ connection. It can be sent from the client or the server. Once sent, the
+ sender will ignore frames sent on new streams for the remainder of the
+ connection.
+ """
+ #: The flags defined for GOAWAY frames.
+ defined_flags = []
+
+ #: The type byte defined for GOAWAY frames.
+ type = 0x07
+
+ stream_association = _STREAM_ASSOC_NO_STREAM
+
+ def __init__(self,
+ stream_id=0,
+ last_stream_id=0,
+ error_code=0,
+ additional_data=b'',
+ **kwargs):
+ super().__init__(stream_id, **kwargs)
+
+ #: The last stream ID definitely seen by the remote peer.
+ self.last_stream_id = last_stream_id
+
+ #: The error code for connection teardown.
+ self.error_code = error_code
+
+ #: Any additional data sent in the GOAWAY.
+ self.additional_data = additional_data
+
+ def _body_repr(self):
+ return "last_stream_id={}, error_code={}, additional_data={}".format(
+ self.last_stream_id,
+ self.error_code,
+ self.additional_data,
+ )
+
+ def serialize_body(self):
+ data = _STRUCT_LL.pack(
+ self.last_stream_id & 0x7FFFFFFF,
+ self.error_code
+ )
+ data += self.additional_data
+
+ return data
+
+ def parse_body(self, data):
+ try:
+ self.last_stream_id, self.error_code = _STRUCT_LL.unpack(
+ data[:8]
+ )
+ except struct.error:
+ raise InvalidFrameError("Invalid GOAWAY body.")
+
+ self.body_len = len(data)
+
+ if len(data) > 8:
+ self.additional_data = data[8:].tobytes()
+
+
+class WindowUpdateFrame(Frame):
+ """
+ The WINDOW_UPDATE frame is used to implement flow control.
+
+ Flow control operates at two levels: on each individual stream and on the
+ entire connection.
+
+ Both types of flow control are hop by hop; that is, only between the two
+ endpoints. Intermediaries do not forward WINDOW_UPDATE frames between
+ dependent connections. However, throttling of data transfer by any receiver
+ can indirectly cause the propagation of flow control information toward the
+ original sender.
+ """
+ #: The flags defined for WINDOW_UPDATE frames.
+ defined_flags = []
+
+ #: The type byte defined for WINDOW_UPDATE frames.
+ type = 0x08
+
+ stream_association = _STREAM_ASSOC_EITHER
+
+ def __init__(self, stream_id, window_increment=0, **kwargs):
+ super().__init__(stream_id, **kwargs)
+
+ #: The amount the flow control window is to be incremented.
+ self.window_increment = window_increment
+
+ def _body_repr(self):
+ return "window_increment={}".format(
+ self.window_increment,
+ )
+
+ def serialize_body(self):
+ return _STRUCT_L.pack(self.window_increment & 0x7FFFFFFF)
+
+ def parse_body(self, data):
+ if len(data) > 4:
+ raise InvalidFrameError(
+ "WINDOW_UPDATE frame must have 4 byte length: got %s" %
+ len(data)
+ )
+
+ try:
+ self.window_increment = _STRUCT_L.unpack(data)[0]
+ except struct.error:
+ raise InvalidFrameError("Invalid WINDOW_UPDATE body")
+
+ if not 1 <= self.window_increment <= 2**31-1:
+ raise InvalidDataError(
+ "WINDOW_UPDATE increment must be between 1 and 2^31-1"
+ )
+
+ self.body_len = 4
+
+
+class HeadersFrame(Padding, Priority, Frame):
+ """
+ The HEADERS frame carries name-value pairs. It is used to open a stream.
+ HEADERS frames can be sent on a stream in the "open" or "half closed
+ (remote)" states.
+
+ In this implementation the HeadersFrame class is essentially a data
+ frame, because of the requirement to control the sizes of frames.
+ A header block fragment that doesn't fit in an entire HEADERS frame needs
+ to be followed with CONTINUATION frames. From the perspective of the frame
+ building code the header block is an opaque data segment.
+ """
+ #: The flags defined for HEADERS frames.
+ defined_flags = [
+ Flag('END_STREAM', 0x01),
+ Flag('END_HEADERS', 0x04),
+ Flag('PADDED', 0x08),
+ Flag('PRIORITY', 0x20),
+ ]
+
+ #: The type byte defined for HEADERS frames.
+ type = 0x01
+
+ stream_association = _STREAM_ASSOC_HAS_STREAM
+
+ def __init__(self, stream_id, data=b'', **kwargs):
+ super().__init__(stream_id, **kwargs)
+
+ #: The HPACK-encoded header block.
+ self.data = data
+
+ def _body_repr(self):
+ return "exclusive={}, depends_on={}, stream_weight={}, data={}".format(
+ self.exclusive,
+ self.depends_on,
+ self.stream_weight,
+ _raw_data_repr(self.data),
+ )
+
+ def serialize_body(self):
+ padding_data = self.serialize_padding_data()
+ padding = b'\0' * self.pad_length
+
+ if 'PRIORITY' in self.flags:
+ priority_data = self.serialize_priority_data()
+ else:
+ priority_data = b''
+
+ return b''.join([padding_data, priority_data, self.data, padding])
+
+ def parse_body(self, data):
+ padding_data_length = self.parse_padding_data(data)
+ data = data[padding_data_length:]
+
+ if 'PRIORITY' in self.flags:
+ priority_data_length = self.parse_priority_data(data)
+ else:
+ priority_data_length = 0
+
+ self.body_len = len(data)
+ self.data = (
+ data[priority_data_length:len(data)-self.pad_length].tobytes()
+ )
+
+ if self.pad_length and self.pad_length >= self.body_len:
+ raise InvalidPaddingError("Padding is too long.")
+
+
+class ContinuationFrame(Frame):
+ """
+ The CONTINUATION frame is used to continue a sequence of header block
+ fragments. Any number of CONTINUATION frames can be sent on an existing
+ stream, as long as the preceding frame on the same stream is one of
+ HEADERS, PUSH_PROMISE or CONTINUATION without the END_HEADERS flag set.
+
+ Much like the HEADERS frame, hyper treats this as an opaque data frame with
+ different flags and a different type.
+ """
+ #: The flags defined for CONTINUATION frames.
+ defined_flags = [Flag('END_HEADERS', 0x04)]
+
+ #: The type byte defined for CONTINUATION frames.
+ type = 0x09
+
+ stream_association = _STREAM_ASSOC_HAS_STREAM
+
+ def __init__(self, stream_id, data=b'', **kwargs):
+ super().__init__(stream_id, **kwargs)
+
+ #: The HPACK-encoded header block.
+ self.data = data
+
+ def _body_repr(self):
+ return "data={}".format(
+ _raw_data_repr(self.data),
+ )
+
+ def serialize_body(self):
+ return self.data
+
+ def parse_body(self, data):
+ self.data = data.tobytes()
+ self.body_len = len(data)
+
+
+class AltSvcFrame(Frame):
+ """
+ The ALTSVC frame is used to advertise alternate services that the current
+ host, or a different one, can understand. This frame is standardised as
+ part of RFC 7838.
+
+ This frame does no work to validate that the ALTSVC field parameter is
+ acceptable per the rules of RFC 7838.
+
+ .. note:: If the ``stream_id`` of this frame is nonzero, the origin field
+ must have zero length. Conversely, if the ``stream_id`` of this
+ frame is zero, the origin field must have nonzero length. Put
+ another way, a valid ALTSVC frame has ``stream_id != 0`` XOR
+ ``len(origin) != 0``.
+ """
+ type = 0xA
+
+ stream_association = _STREAM_ASSOC_EITHER
+
+ def __init__(self, stream_id, origin=b'', field=b'', **kwargs):
+ super().__init__(stream_id, **kwargs)
+
+ if not isinstance(origin, bytes):
+ raise InvalidDataError("AltSvc origin must be a bytestring.")
+ if not isinstance(field, bytes):
+ raise InvalidDataError("AltSvc field must be a bytestring.")
+ self.origin = origin
+ self.field = field
+
+ def _body_repr(self):
+ return "origin={}, field={}".format(
+ self.origin,
+ self.field,
+ )
+
+ def serialize_body(self):
+ origin_len = _STRUCT_H.pack(len(self.origin))
+ return b''.join([origin_len, self.origin, self.field])
+
+ def parse_body(self, data):
+ try:
+ origin_len = _STRUCT_H.unpack(data[0:2])[0]
+ self.origin = data[2:2+origin_len].tobytes()
+
+ if len(self.origin) != origin_len:
+ raise InvalidFrameError("Invalid ALTSVC frame body.")
+
+ self.field = data[2+origin_len:].tobytes()
+ except (struct.error, ValueError):
+ raise InvalidFrameError("Invalid ALTSVC frame body.")
+
+ self.body_len = len(data)
+
+
+class ExtensionFrame(Frame):
+ """
+ ExtensionFrame is used to wrap frames which are not natively interpretable
+ by hyperframe.
+
+ Although certain byte prefixes are ordained by specification to have
+ certain contextual meanings, frames with other prefixes are not prohibited,
+ and may be used to communicate arbitrary meaning between HTTP/2 peers.
+
+ Thus, hyperframe, rather than raising an exception when such a frame is
+ encountered, wraps it in a generic frame to be properly acted upon by
+ upstream consumers which might have additional context on how to use it.
+
+ .. versionadded:: 5.0.0
+ """
+
+ stream_association = _STREAM_ASSOC_EITHER
+
+ def __init__(self, type, stream_id, flag_byte=0x0, body=b'', **kwargs):
+ super().__init__(stream_id, **kwargs)
+ self.type = type
+ self.flag_byte = flag_byte
+ self.body = body
+
+ def _body_repr(self):
+ return "type={}, flag_byte={}, body={}".format(
+ self.type,
+ self.flag_byte,
+ _raw_data_repr(self.body),
+ )
+
+ def parse_flags(self, flag_byte):
+ """
+ For extension frames, we parse the flags by just storing a flag byte.
+ """
+ self.flag_byte = flag_byte
+
+ def parse_body(self, data):
+ self.body = data.tobytes()
+ self.body_len = len(data)
+
+ def serialize(self):
+ """
+ A broad override of the serialize method that ensures that the data
+ comes back out exactly as it came in. This should not be used in most
+ user code: it exists only as a helper method if frames need to be
+ reconstituted.
+ """
+ # Build the frame header.
+ # First, get the flags.
+ flags = self.flag_byte
+
+ header = _STRUCT_HBBBL.pack(
+ (self.body_len >> 8) & 0xFFFF, # Length spread over top 24 bits
+ self.body_len & 0xFF,
+ self.type,
+ flags,
+ self.stream_id & 0x7FFFFFFF # Stream ID is 32 bits.
+ )
+
+ return header + self.body
+
+
+def _raw_data_repr(data):
+ if not data:
+ return "None"
+ r = binascii.hexlify(data).decode('ascii')
+ if len(r) > 20:
+ r = r[:20] + "..."
+ return "<hex:" + r + ">"
+
+
+_FRAME_CLASSES = [
+ DataFrame,
+ HeadersFrame,
+ PriorityFrame,
+ RstStreamFrame,
+ SettingsFrame,
+ PushPromiseFrame,
+ PingFrame,
+ GoAwayFrame,
+ WindowUpdateFrame,
+ ContinuationFrame,
+ AltSvcFrame,
+]
+#: FRAMES maps the type byte for each frame to the class used to represent that
+#: frame.
+FRAMES = {cls.type: cls for cls in _FRAME_CLASSES}
diff --git a/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/INSTALLER b/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/INSTALLER
new file mode 100644
index 0000000..a1b589e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/INSTALLER
@@ -0,0 +1 @@
+pip
diff --git a/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/LICENSE.rst b/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/LICENSE.rst
new file mode 100644
index 0000000..63664b8
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/LICENSE.rst
@@ -0,0 +1,34 @@
+License
+-------
+
+License: bsd-3-clause
+
+Copyright (c) 2013-2020, Kim Davies. All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are met:
+
+#. Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+
+#. Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions and the following
+ disclaimer in the documentation and/or other materials provided with
+ the distribution.
+
+#. Neither the name of the copyright holder nor the names of the
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+#. THIS SOFTWARE IS PROVIDED BY THE CONTRIBUTORS "AS IS" AND ANY
+ EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS OR
+ CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
+ USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH
+ DAMAGE.
diff --git a/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/METADATA b/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/METADATA
new file mode 100644
index 0000000..f73c0ff
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/METADATA
@@ -0,0 +1,243 @@
+Metadata-Version: 2.1
+Name: idna
+Version: 2.10
+Summary: Internationalized Domain Names in Applications (IDNA)
+Home-page: https://github.com/kjd/idna
+Author: Kim Davies
+Author-email: kim@cynosure.com.au
+License: BSD-like
+Platform: UNKNOWN
+Classifier: Development Status :: 5 - Production/Stable
+Classifier: Intended Audience :: Developers
+Classifier: Intended Audience :: System Administrators
+Classifier: License :: OSI Approved :: BSD License
+Classifier: Operating System :: OS Independent
+Classifier: Programming Language :: Python
+Classifier: Programming Language :: Python :: 2
+Classifier: Programming Language :: Python :: 2.7
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.4
+Classifier: Programming Language :: Python :: 3.5
+Classifier: Programming Language :: Python :: 3.6
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: Implementation :: CPython
+Classifier: Programming Language :: Python :: Implementation :: PyPy
+Classifier: Topic :: Internet :: Name Service (DNS)
+Classifier: Topic :: Software Development :: Libraries :: Python Modules
+Classifier: Topic :: Utilities
+Requires-Python: >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*
+
+Internationalized Domain Names in Applications (IDNA)
+=====================================================
+
+Support for the Internationalised Domain Names in Applications
+(IDNA) protocol as specified in `RFC 5891 <https://tools.ietf.org/html/rfc5891>`_.
+This is the latest version of the protocol and is sometimes referred to as
+“IDNA 2008”.
+
+This library also provides support for Unicode Technical Standard 46,
+`Unicode IDNA Compatibility Processing <https://unicode.org/reports/tr46/>`_.
+
+This acts as a suitable replacement for the “encodings.idna” module that
+comes with the Python standard library, which only supports the
+old, deprecated IDNA specification (`RFC 3490 <https://tools.ietf.org/html/rfc3490>`_).
+
+Basic functions are simply executed:
+
+.. code-block:: pycon
+
+ # Python 3
+ >>> import idna
+ >>> idna.encode('ドメイン.テスト')
+ b'xn--eckwd4c7c.xn--zckzah'
+ >>> print(idna.decode('xn--eckwd4c7c.xn--zckzah'))
+ ドメイン.テスト
+
+ # Python 2
+ >>> import idna
+ >>> idna.encode(u'ドメイン.テスト')
+ 'xn--eckwd4c7c.xn--zckzah'
+ >>> print idna.decode('xn--eckwd4c7c.xn--zckzah')
+ ドメイン.テスト
+
+Packages
+--------
+
+The latest tagged release version is published in the PyPI repository:
+
+.. image:: https://badge.fury.io/py/idna.svg
+ :target: http://badge.fury.io/py/idna
+
+
+Installation
+------------
+
+To install this library, you can use pip:
+
+.. code-block:: bash
+
+ $ pip install idna
+
+Alternatively, you can install the package using the bundled setup script:
+
+.. code-block:: bash
+
+ $ python setup.py install
+
+This library works with Python 2.7 and Python 3.4 or later.
+
+
+Usage
+-----
+
+For typical usage, the ``encode`` and ``decode`` functions will take a domain
+name argument and perform a conversion to A-labels or U-labels respectively.
+
+.. code-block:: pycon
+
+ # Python 3
+ >>> import idna
+ >>> idna.encode('ドメイン.テスト')
+ b'xn--eckwd4c7c.xn--zckzah'
+ >>> print(idna.decode('xn--eckwd4c7c.xn--zckzah'))
+ ドメイン.テスト
+
+You may also use Python's codec machinery for encoding and decoding via the
+``idna.codec`` module:
+
+.. code-block:: pycon
+
+ # Python 2
+ >>> import idna.codec
+ >>> print u'домена.испытание'.encode('idna')
+ xn--80ahd1agd.xn--80akhbyknj4f
+ >>> print 'xn--80ahd1agd.xn--80akhbyknj4f'.decode('idna')
+ домена.испытание
+
+Conversions can be applied on a per-label basis using the ``ulabel`` or ``alabel``
+functions if necessary:
+
+.. code-block:: pycon
+
+ # Python 2
+ >>> idna.alabel(u'测试')
+ 'xn--0zwm56d'
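The same per-label round trip works in Python 3, where ``alabel`` returns bytes and ``ulabel`` returns a string. A minimal sketch, assuming the ``idna`` package is installed:

```python
import idna

# Convert a single label (no dots) to its A-label and back again.
assert idna.alabel('测试') == b'xn--0zwm56d'
assert idna.ulabel('xn--0zwm56d') == '测试'
```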
+
+Compatibility Mapping (UTS #46)
++++++++++++++++++++++++++++++++
+
+As described in `RFC 5895 `_, the IDNA
+specification no longer normalizes input from different potential ways a user
+may input a domain name. This functionality, known as a “mapping”, is now
+considered by the specification to be a local user-interface issue distinct
+from IDNA conversion functionality.
+
+This library provides one such mapping, which was developed by the Unicode
+Consortium. Known as `Unicode IDNA Compatibility Processing `_,
+it provides for both a regular mapping for typical applications, as well as
+a transitional mapping to help migrate from older IDNA 2003 applications.
+
+For example, “Königsgäßchen” is not a permissible label as *LATIN CAPITAL
+LETTER K* is not allowed (nor are capital letters in general). UTS 46 will
+convert this into lower case prior to applying the IDNA conversion.
+
+.. code-block:: pycon
+
+ # Python 3
+ >>> import idna
+ >>> idna.encode(u'Königsgäßchen')
+ ...
+ idna.core.InvalidCodepoint: Codepoint U+004B at position 1 of 'Königsgäßchen' not allowed
+ >>> idna.encode('Königsgäßchen', uts46=True)
+ b'xn--knigsgchen-b4a3dun'
+ >>> print(idna.decode('xn--knigsgchen-b4a3dun'))
+ königsgäßchen
+
+Transitional processing provides conversions to help transition from the older
+2003 standard to the current standard. For example, in the original IDNA
+specification, the *LATIN SMALL LETTER SHARP S* (ß) was converted into two
+*LATIN SMALL LETTER S* (ss), whereas in the current IDNA specification this
+conversion is not performed.
+
+.. code-block:: pycon
+
+ # Python 2
+ >>> idna.encode(u'Königsgäßchen', uts46=True, transitional=True)
+ 'xn--knigsgsschen-lcb0w'
+
+Implementors should use transitional processing with caution, only in rare
+cases where conversion from legacy labels to current labels must be performed
+(i.e. IDNA implementations that pre-date 2008). For typical applications
+that just need to convert labels, transitional processing is unlikely to be
+beneficial and could produce unexpected incompatible results.
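The difference between the two modes can be seen by comparing outputs for the same input. A minimal sketch using the values shown above, assuming the ``idna`` package is installed:

```python
import idna

# Default (non-transitional) UTS 46 processing keeps the sharp s (ß).
assert idna.encode('Königsgäßchen', uts46=True) == b'xn--knigsgchen-b4a3dun'

# Transitional processing maps ß to "ss", as IDNA 2003 did.
assert idna.encode('Königsgäßchen', uts46=True,
                   transitional=True) == b'xn--knigsgsschen-lcb0w'
```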
+
+``encodings.idna`` Compatibility
+++++++++++++++++++++++++++++++++
+
+Function calls from the Python built-in ``encodings.idna`` module are
+mapped to their IDNA 2008 equivalents using the ``idna.compat`` module.
+Simply substitute the ``import`` clause in your code to refer to the
+new module name.
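For example, ``ToASCII`` and ``ToUnicode`` from ``idna.compat`` mirror the ``encodings.idna`` interface while performing IDNA 2008 conversions. A minimal sketch, assuming the ``idna`` package is installed:

```python
import idna.compat

# ToASCII/ToUnicode wrap idna.encode/idna.decode respectively.
assert idna.compat.ToASCII('ドメイン.テスト') == b'xn--eckwd4c7c.xn--zckzah'
assert idna.compat.ToUnicode('xn--eckwd4c7c.xn--zckzah') == 'ドメイン.テスト'
```

Note that ``nameprep`` has no IDNA 2008 equivalent and raises ``NotImplementedError``.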
+
+Exceptions
+----------
+
+All errors raised during conversion according to the specification derive
+from the ``idna.IDNAError`` base class.
+
+More specific exceptions may be raised: ``idna.IDNABidiError``
+when the error reflects an illegal combination of left-to-right and right-to-left
+characters in a label; ``idna.InvalidCodepoint`` when a specific codepoint is
+an illegal character in an IDN label (i.e. DISALLOWED); and ``idna.InvalidCodepointContext``
+when the codepoint is illegal based on its positional context (i.e. it is CONTEXTO
+or CONTEXTJ but the contextual requirements are not satisfied).
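Because every conversion error derives from the common base class, callers can catch ``idna.IDNAError`` alone. A minimal sketch, assuming the ``idna`` package is installed:

```python
import idna

# The capital "K" is disallowed without UTS 46 mapping (see the
# "Königsgäßchen" example above), so encode() raises an exception
# derived from idna.IDNAError.
try:
    idna.encode('Königsgäßchen')
except idna.IDNAError as exc:
    print('rejected:', type(exc).__name__)
```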
+
+Building and Diagnostics
+------------------------
+
+The IDNA and UTS 46 functionality relies upon pre-calculated lookup tables for
+performance. These tables are derived by computing eligibility against the
+criteria in the respective standards, and are generated with the command-line
+script ``tools/idna-data``.
+
+This tool will fetch relevant tables from the Unicode Consortium and perform the
+required calculations to identify eligibility. It has three main modes:
+
+* ``idna-data make-libdata``. Generates ``idnadata.py`` and ``uts46data.py``,
+ the pre-calculated lookup tables used for IDNA and UTS 46 conversions. Implementors
+ who wish to track this library against a different Unicode version may use this tool
+ to manually generate a different version of the ``idnadata.py`` and ``uts46data.py``
+ files.
+
+* ``idna-data make-table``. Generates a table of the IDNA disposition
+ (e.g. PVALID, CONTEXTJ, CONTEXTO) in the format found in Appendix B.1 of RFC
+ 5892 and the pre-computed tables published by `IANA `_.
+
+* ``idna-data U+0061``. Prints debugging output on the various properties
+ associated with an individual Unicode codepoint (in this case, U+0061), that are
+ used to assess the IDNA and UTS 46 status of a codepoint. This is helpful in debugging
+ or analysis.
+
+The tool accepts a number of arguments, described using ``idna-data -h``. Most notably,
+the ``--version`` argument allows the specification of the version of Unicode to use
+in computing the table data. For example, ``idna-data --version 9.0.0 make-libdata``
+will generate library data against Unicode 9.0.0.
+
+Note that this script requires Python 3, but all generated library data will work
+in Python 2.7.
+
+
+Testing
+-------
+
+The library has a test suite based on each rule of the IDNA specification, as
+well as tests that are provided as part of the Unicode Technical Standard 46,
+`Unicode IDNA Compatibility Processing `_.
+
+The tests are run automatically on each commit at Travis CI:
+
+.. image:: https://travis-ci.org/kjd/idna.svg?branch=master
+ :target: https://travis-ci.org/kjd/idna
+
+
diff --git a/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/RECORD b/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/RECORD
new file mode 100644
index 0000000..465a67d
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/RECORD
@@ -0,0 +1,22 @@
+idna-2.10.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+idna-2.10.dist-info/LICENSE.rst,sha256=QSAUQg0kc9ugYRfD1Nng7sqm3eDKMM2VH07CvjlCbzI,1565
+idna-2.10.dist-info/METADATA,sha256=ZWCaQDBjdmSvx5EU7Cv6ORC-9NUQ6nXh1eXx38ySe40,9104
+idna-2.10.dist-info/RECORD,,
+idna-2.10.dist-info/WHEEL,sha256=8zNYZbwQSXoB9IfXOjPfeNwvAsALAjffgk27FqvCWbo,110
+idna-2.10.dist-info/top_level.txt,sha256=jSag9sEDqvSPftxOQy-ABfGV_RSy7oFh4zZJpODV8k0,5
+idna/__init__.py,sha256=9Nt7xpyet3DmOrPUGooDdAwmHZZu1qUAy2EaJ93kGiQ,58
+idna/__pycache__/__init__.cpython-39.pyc,,
+idna/__pycache__/codec.cpython-39.pyc,,
+idna/__pycache__/compat.cpython-39.pyc,,
+idna/__pycache__/core.cpython-39.pyc,,
+idna/__pycache__/idnadata.cpython-39.pyc,,
+idna/__pycache__/intranges.cpython-39.pyc,,
+idna/__pycache__/package_data.cpython-39.pyc,,
+idna/__pycache__/uts46data.cpython-39.pyc,,
+idna/codec.py,sha256=lvYb7yu7PhAqFaAIAdWcwgaWI2UmgseUua-1c0AsG0A,3299
+idna/compat.py,sha256=R-h29D-6mrnJzbXxymrWUW7iZUvy-26TQwZ0ij57i4U,232
+idna/core.py,sha256=jCoaLb3bA2tS_DDx9PpGuNTEZZN2jAzB369aP-IHYRE,11951
+idna/idnadata.py,sha256=gmzFwZWjdms3kKZ_M_vwz7-LP_SCgYfSeE03B21Qpsk,42350
+idna/intranges.py,sha256=TY1lpxZIQWEP6tNqjZkFA5hgoMWOj1OBmnUG8ihT87E,1749
+idna/package_data.py,sha256=bxBjpLnE06_1jSYKEy5svOMu1zM3OMztXVUb1tPlcp0,22
+idna/uts46data.py,sha256=lMdw2zdjkH1JUWXPPEfFUSYT3Fyj60bBmfLvvy5m7ko,202084
diff --git a/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/WHEEL b/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/WHEEL
new file mode 100644
index 0000000..8b701e9
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/WHEEL
@@ -0,0 +1,6 @@
+Wheel-Version: 1.0
+Generator: bdist_wheel (0.33.6)
+Root-Is-Purelib: true
+Tag: py2-none-any
+Tag: py3-none-any
+
diff --git a/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/top_level.txt b/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/top_level.txt
new file mode 100644
index 0000000..c40472e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna-2.10.dist-info/top_level.txt
@@ -0,0 +1 @@
+idna
diff --git a/.venv/lib/python3.9/site-packages/idna/__init__.py b/.venv/lib/python3.9/site-packages/idna/__init__.py
new file mode 100644
index 0000000..847bf93
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna/__init__.py
@@ -0,0 +1,2 @@
+from .package_data import __version__
+from .core import *
diff --git a/.venv/lib/python3.9/site-packages/idna/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/idna/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..139a6b5
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/idna/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/idna/__pycache__/codec.cpython-39.pyc b/.venv/lib/python3.9/site-packages/idna/__pycache__/codec.cpython-39.pyc
new file mode 100644
index 0000000..470e0d6
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/idna/__pycache__/codec.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/idna/__pycache__/compat.cpython-39.pyc b/.venv/lib/python3.9/site-packages/idna/__pycache__/compat.cpython-39.pyc
new file mode 100644
index 0000000..3b77c25
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/idna/__pycache__/compat.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/idna/__pycache__/core.cpython-39.pyc b/.venv/lib/python3.9/site-packages/idna/__pycache__/core.cpython-39.pyc
new file mode 100644
index 0000000..988cdad
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/idna/__pycache__/core.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/idna/__pycache__/idnadata.cpython-39.pyc b/.venv/lib/python3.9/site-packages/idna/__pycache__/idnadata.cpython-39.pyc
new file mode 100644
index 0000000..38a7ca3
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/idna/__pycache__/idnadata.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/idna/__pycache__/intranges.cpython-39.pyc b/.venv/lib/python3.9/site-packages/idna/__pycache__/intranges.cpython-39.pyc
new file mode 100644
index 0000000..0b2da5a
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/idna/__pycache__/intranges.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/idna/__pycache__/package_data.cpython-39.pyc b/.venv/lib/python3.9/site-packages/idna/__pycache__/package_data.cpython-39.pyc
new file mode 100644
index 0000000..c4bfc4a
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/idna/__pycache__/package_data.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/idna/__pycache__/uts46data.cpython-39.pyc b/.venv/lib/python3.9/site-packages/idna/__pycache__/uts46data.cpython-39.pyc
new file mode 100644
index 0000000..96845bb
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/idna/__pycache__/uts46data.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/idna/codec.py b/.venv/lib/python3.9/site-packages/idna/codec.py
new file mode 100644
index 0000000..98c65ea
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna/codec.py
@@ -0,0 +1,118 @@
+from .core import encode, decode, alabel, ulabel, IDNAError
+import codecs
+import re
+
+_unicode_dots_re = re.compile(u'[\u002e\u3002\uff0e\uff61]')
+
+class Codec(codecs.Codec):
+
+ def encode(self, data, errors='strict'):
+
+ if errors != 'strict':
+ raise IDNAError("Unsupported error handling \"{0}\"".format(errors))
+
+ if not data:
+ return "", 0
+
+ return encode(data), len(data)
+
+ def decode(self, data, errors='strict'):
+
+ if errors != 'strict':
+ raise IDNAError("Unsupported error handling \"{0}\"".format(errors))
+
+ if not data:
+ return u"", 0
+
+ return decode(data), len(data)
+
+class IncrementalEncoder(codecs.BufferedIncrementalEncoder):
+ def _buffer_encode(self, data, errors, final):
+ if errors != 'strict':
+ raise IDNAError("Unsupported error handling \"{0}\"".format(errors))
+
+ if not data:
+ return ("", 0)
+
+ labels = _unicode_dots_re.split(data)
+ trailing_dot = u''
+ if labels:
+ if not labels[-1]:
+ trailing_dot = '.'
+ del labels[-1]
+ elif not final:
+ # Keep potentially unfinished label until the next call
+ del labels[-1]
+ if labels:
+ trailing_dot = '.'
+
+ result = []
+ size = 0
+ for label in labels:
+ result.append(alabel(label))
+ if size:
+ size += 1
+ size += len(label)
+
+ # Join with U+002E
+ result = ".".join(result) + trailing_dot
+ size += len(trailing_dot)
+ return (result, size)
+
+class IncrementalDecoder(codecs.BufferedIncrementalDecoder):
+ def _buffer_decode(self, data, errors, final):
+ if errors != 'strict':
+ raise IDNAError("Unsupported error handling \"{0}\"".format(errors))
+
+ if not data:
+ return (u"", 0)
+
+ # IDNA allows decoding to operate on Unicode strings, too.
+ if isinstance(data, unicode):
+ labels = _unicode_dots_re.split(data)
+ else:
+ # Must be ASCII string
+ data = str(data)
+ unicode(data, "ascii")
+ labels = data.split(".")
+
+ trailing_dot = u''
+ if labels:
+ if not labels[-1]:
+ trailing_dot = u'.'
+ del labels[-1]
+ elif not final:
+ # Keep potentially unfinished label until the next call
+ del labels[-1]
+ if labels:
+ trailing_dot = u'.'
+
+ result = []
+ size = 0
+ for label in labels:
+ result.append(ulabel(label))
+ if size:
+ size += 1
+ size += len(label)
+
+ result = u".".join(result) + trailing_dot
+ size += len(trailing_dot)
+ return (result, size)
+
+
+class StreamWriter(Codec, codecs.StreamWriter):
+ pass
+
+class StreamReader(Codec, codecs.StreamReader):
+ pass
+
+def getregentry():
+ return codecs.CodecInfo(
+ name='idna',
+ encode=Codec().encode,
+ decode=Codec().decode,
+ incrementalencoder=IncrementalEncoder,
+ incrementaldecoder=IncrementalDecoder,
+ streamwriter=StreamWriter,
+ streamreader=StreamReader,
+ )
diff --git a/.venv/lib/python3.9/site-packages/idna/compat.py b/.venv/lib/python3.9/site-packages/idna/compat.py
new file mode 100644
index 0000000..4d47f33
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna/compat.py
@@ -0,0 +1,12 @@
+from .core import *
+from .codec import *
+
+def ToASCII(label):
+ return encode(label)
+
+def ToUnicode(label):
+ return decode(label)
+
+def nameprep(s):
+ raise NotImplementedError("IDNA 2008 does not utilise nameprep protocol")
+
diff --git a/.venv/lib/python3.9/site-packages/idna/core.py b/.venv/lib/python3.9/site-packages/idna/core.py
new file mode 100644
index 0000000..41ec5c7
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna/core.py
@@ -0,0 +1,400 @@
+from . import idnadata
+import bisect
+import unicodedata
+import re
+import sys
+from .intranges import intranges_contain
+
+_virama_combining_class = 9
+_alabel_prefix = b'xn--'
+_unicode_dots_re = re.compile(u'[\u002e\u3002\uff0e\uff61]')
+
+if sys.version_info[0] >= 3:
+ unicode = str
+ unichr = chr
+
+class IDNAError(UnicodeError):
+ """ Base exception for all IDNA-encoding related problems """
+ pass
+
+
+class IDNABidiError(IDNAError):
+ """ Exception when bidirectional requirements are not satisfied """
+ pass
+
+
+class InvalidCodepoint(IDNAError):
+ """ Exception when a disallowed or unallocated codepoint is used """
+ pass
+
+
+class InvalidCodepointContext(IDNAError):
+ """ Exception when the codepoint is not valid in the context it is used """
+ pass
+
+
+def _combining_class(cp):
+ v = unicodedata.combining(unichr(cp))
+ if v == 0:
+ if not unicodedata.name(unichr(cp)):
+ raise ValueError("Unknown character in unicodedata")
+ return v
+
+def _is_script(cp, script):
+ return intranges_contain(ord(cp), idnadata.scripts[script])
+
+def _punycode(s):
+ return s.encode('punycode')
+
+def _unot(s):
+ return 'U+{0:04X}'.format(s)
+
+
+def valid_label_length(label):
+
+ if len(label) > 63:
+ return False
+ return True
+
+
+def valid_string_length(label, trailing_dot):
+
+ if len(label) > (254 if trailing_dot else 253):
+ return False
+ return True
+
+
+def check_bidi(label, check_ltr=False):
+
+ # Bidi rules should only be applied if string contains RTL characters
+ bidi_label = False
+ for (idx, cp) in enumerate(label, 1):
+ direction = unicodedata.bidirectional(cp)
+ if direction == '':
+ # String likely comes from a newer version of Unicode
+ raise IDNABidiError('Unknown directionality in label {0} at position {1}'.format(repr(label), idx))
+ if direction in ['R', 'AL', 'AN']:
+ bidi_label = True
+ if not bidi_label and not check_ltr:
+ return True
+
+ # Bidi rule 1
+ direction = unicodedata.bidirectional(label[0])
+ if direction in ['R', 'AL']:
+ rtl = True
+ elif direction == 'L':
+ rtl = False
+ else:
+ raise IDNABidiError('First codepoint in label {0} must be directionality L, R or AL'.format(repr(label)))
+
+ valid_ending = False
+ number_type = False
+ for (idx, cp) in enumerate(label, 1):
+ direction = unicodedata.bidirectional(cp)
+
+ if rtl:
+ # Bidi rule 2
+ if not direction in ['R', 'AL', 'AN', 'EN', 'ES', 'CS', 'ET', 'ON', 'BN', 'NSM']:
+ raise IDNABidiError('Invalid direction for codepoint at position {0} in a right-to-left label'.format(idx))
+ # Bidi rule 3
+ if direction in ['R', 'AL', 'EN', 'AN']:
+ valid_ending = True
+ elif direction != 'NSM':
+ valid_ending = False
+ # Bidi rule 4
+ if direction in ['AN', 'EN']:
+ if not number_type:
+ number_type = direction
+ else:
+ if number_type != direction:
+ raise IDNABidiError('Can not mix numeral types in a right-to-left label')
+ else:
+ # Bidi rule 5
+ if not direction in ['L', 'EN', 'ES', 'CS', 'ET', 'ON', 'BN', 'NSM']:
+ raise IDNABidiError('Invalid direction for codepoint at position {0} in a left-to-right label'.format(idx))
+ # Bidi rule 6
+ if direction in ['L', 'EN']:
+ valid_ending = True
+ elif direction != 'NSM':
+ valid_ending = False
+
+ if not valid_ending:
+ raise IDNABidiError('Label ends with illegal codepoint directionality')
+
+ return True
+
+
+def check_initial_combiner(label):
+
+ if unicodedata.category(label[0])[0] == 'M':
+ raise IDNAError('Label begins with an illegal combining character')
+ return True
+
+
+def check_hyphen_ok(label):
+
+ if label[2:4] == '--':
+ raise IDNAError('Label has disallowed hyphens in 3rd and 4th position')
+ if label[0] == '-' or label[-1] == '-':
+ raise IDNAError('Label must not start or end with a hyphen')
+ return True
+
+
+def check_nfc(label):
+
+ if unicodedata.normalize('NFC', label) != label:
+ raise IDNAError('Label must be in Normalization Form C')
+
+
+def valid_contextj(label, pos):
+
+ cp_value = ord(label[pos])
+
+ if cp_value == 0x200c:
+
+ if pos > 0:
+ if _combining_class(ord(label[pos - 1])) == _virama_combining_class:
+ return True
+
+ ok = False
+ for i in range(pos-1, -1, -1):
+ joining_type = idnadata.joining_types.get(ord(label[i]))
+ if joining_type == ord('T'):
+ continue
+ if joining_type in [ord('L'), ord('D')]:
+ ok = True
+ break
+
+ if not ok:
+ return False
+
+ ok = False
+ for i in range(pos+1, len(label)):
+ joining_type = idnadata.joining_types.get(ord(label[i]))
+ if joining_type == ord('T'):
+ continue
+ if joining_type in [ord('R'), ord('D')]:
+ ok = True
+ break
+ return ok
+
+ if cp_value == 0x200d:
+
+ if pos > 0:
+ if _combining_class(ord(label[pos - 1])) == _virama_combining_class:
+ return True
+ return False
+
+ else:
+
+ return False
+
+
+def valid_contexto(label, pos, exception=False):
+
+ cp_value = ord(label[pos])
+
+ if cp_value == 0x00b7:
+ if 0 < pos < len(label)-1:
+ if ord(label[pos - 1]) == 0x006c and ord(label[pos + 1]) == 0x006c:
+ return True
+ return False
+
+ elif cp_value == 0x0375:
+ if pos < len(label)-1 and len(label) > 1:
+ return _is_script(label[pos + 1], 'Greek')
+ return False
+
+ elif cp_value == 0x05f3 or cp_value == 0x05f4:
+ if pos > 0:
+ return _is_script(label[pos - 1], 'Hebrew')
+ return False
+
+ elif cp_value == 0x30fb:
+ for cp in label:
+ if cp == u'\u30fb':
+ continue
+ if _is_script(cp, 'Hiragana') or _is_script(cp, 'Katakana') or _is_script(cp, 'Han'):
+ return True
+ return False
+
+ elif 0x660 <= cp_value <= 0x669:
+ for cp in label:
+ if 0x6f0 <= ord(cp) <= 0x06f9:
+ return False
+ return True
+
+ elif 0x6f0 <= cp_value <= 0x6f9:
+ for cp in label:
+ if 0x660 <= ord(cp) <= 0x0669:
+ return False
+ return True
+
+
+def check_label(label):
+
+ if isinstance(label, (bytes, bytearray)):
+ label = label.decode('utf-8')
+ if len(label) == 0:
+ raise IDNAError('Empty Label')
+
+ check_nfc(label)
+ check_hyphen_ok(label)
+ check_initial_combiner(label)
+
+ for (pos, cp) in enumerate(label):
+ cp_value = ord(cp)
+ if intranges_contain(cp_value, idnadata.codepoint_classes['PVALID']):
+ continue
+ elif intranges_contain(cp_value, idnadata.codepoint_classes['CONTEXTJ']):
+ try:
+ if not valid_contextj(label, pos):
+ raise InvalidCodepointContext('Joiner {0} not allowed at position {1} in {2}'.format(
+ _unot(cp_value), pos+1, repr(label)))
+ except ValueError:
+ raise IDNAError('Unknown codepoint adjacent to joiner {0} at position {1} in {2}'.format(
+ _unot(cp_value), pos+1, repr(label)))
+ elif intranges_contain(cp_value, idnadata.codepoint_classes['CONTEXTO']):
+ if not valid_contexto(label, pos):
+ raise InvalidCodepointContext('Codepoint {0} not allowed at position {1} in {2}'.format(_unot(cp_value), pos+1, repr(label)))
+ else:
+ raise InvalidCodepoint('Codepoint {0} at position {1} of {2} not allowed'.format(_unot(cp_value), pos+1, repr(label)))
+
+ check_bidi(label)
+
+
+def alabel(label):
+
+ try:
+ label = label.encode('ascii')
+ ulabel(label)
+ if not valid_label_length(label):
+ raise IDNAError('Label too long')
+ return label
+ except UnicodeEncodeError:
+ pass
+
+ if not label:
+ raise IDNAError('No Input')
+
+ label = unicode(label)
+ check_label(label)
+ label = _punycode(label)
+ label = _alabel_prefix + label
+
+ if not valid_label_length(label):
+ raise IDNAError('Label too long')
+
+ return label
+
+
+def ulabel(label):
+
+ if not isinstance(label, (bytes, bytearray)):
+ try:
+ label = label.encode('ascii')
+ except UnicodeEncodeError:
+ check_label(label)
+ return label
+
+ label = label.lower()
+ if label.startswith(_alabel_prefix):
+ label = label[len(_alabel_prefix):]
+ if not label:
+ raise IDNAError('Malformed A-label, no Punycode eligible content found')
+ if label.decode('ascii')[-1] == '-':
+ raise IDNAError('A-label must not end with a hyphen')
+ else:
+ check_label(label)
+ return label.decode('ascii')
+
+ label = label.decode('punycode')
+ check_label(label)
+ return label
+
+
+def uts46_remap(domain, std3_rules=True, transitional=False):
+ """Re-map the characters in the string according to UTS46 processing."""
+ from .uts46data import uts46data
+ output = u""
+ try:
+ for pos, char in enumerate(domain):
+ code_point = ord(char)
+ uts46row = uts46data[code_point if code_point < 256 else
+ bisect.bisect_left(uts46data, (code_point, "Z")) - 1]
+ status = uts46row[1]
+ replacement = uts46row[2] if len(uts46row) == 3 else None
+ if (status == "V" or
+ (status == "D" and not transitional) or
+ (status == "3" and not std3_rules and replacement is None)):
+ output += char
+ elif replacement is not None and (status == "M" or
+ (status == "3" and not std3_rules) or
+ (status == "D" and transitional)):
+ output += replacement
+ elif status != "I":
+ raise IndexError()
+ return unicodedata.normalize("NFC", output)
+ except IndexError:
+ raise InvalidCodepoint(
+ "Codepoint {0} not allowed at position {1} in {2}".format(
+ _unot(code_point), pos + 1, repr(domain)))
+
+
+def encode(s, strict=False, uts46=False, std3_rules=False, transitional=False):
+
+ if isinstance(s, (bytes, bytearray)):
+ s = s.decode("ascii")
+ if uts46:
+ s = uts46_remap(s, std3_rules, transitional)
+ trailing_dot = False
+ result = []
+ if strict:
+ labels = s.split('.')
+ else:
+ labels = _unicode_dots_re.split(s)
+ if not labels or labels == ['']:
+ raise IDNAError('Empty domain')
+ if labels[-1] == '':
+ del labels[-1]
+ trailing_dot = True
+ for label in labels:
+ s = alabel(label)
+ if s:
+ result.append(s)
+ else:
+ raise IDNAError('Empty label')
+ if trailing_dot:
+ result.append(b'')
+ s = b'.'.join(result)
+ if not valid_string_length(s, trailing_dot):
+ raise IDNAError('Domain too long')
+ return s
+
+
+def decode(s, strict=False, uts46=False, std3_rules=False):
+
+ if isinstance(s, (bytes, bytearray)):
+ s = s.decode("ascii")
+ if uts46:
+ s = uts46_remap(s, std3_rules, False)
+ trailing_dot = False
+ result = []
+ if not strict:
+ labels = _unicode_dots_re.split(s)
+ else:
+ labels = s.split(u'.')
+ if not labels or labels == ['']:
+ raise IDNAError('Empty domain')
+ if not labels[-1]:
+ del labels[-1]
+ trailing_dot = True
+ for label in labels:
+ s = ulabel(label)
+ if s:
+ result.append(s)
+ else:
+ raise IDNAError('Empty label')
+ if trailing_dot:
+ result.append(u'')
+ return u'.'.join(result)
diff --git a/.venv/lib/python3.9/site-packages/idna/idnadata.py b/.venv/lib/python3.9/site-packages/idna/idnadata.py
new file mode 100644
index 0000000..a284e4c
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna/idnadata.py
@@ -0,0 +1,2050 @@
+# This file is automatically generated by tools/idna-data
+
+__version__ = "13.0.0"
+scripts = {
+ 'Greek': (
+ 0x37000000374,
+ 0x37500000378,
+ 0x37a0000037e,
+ 0x37f00000380,
+ 0x38400000385,
+ 0x38600000387,
+ 0x3880000038b,
+ 0x38c0000038d,
+ 0x38e000003a2,
+ 0x3a3000003e2,
+ 0x3f000000400,
+ 0x1d2600001d2b,
+ 0x1d5d00001d62,
+ 0x1d6600001d6b,
+ 0x1dbf00001dc0,
+ 0x1f0000001f16,
+ 0x1f1800001f1e,
+ 0x1f2000001f46,
+ 0x1f4800001f4e,
+ 0x1f5000001f58,
+ 0x1f5900001f5a,
+ 0x1f5b00001f5c,
+ 0x1f5d00001f5e,
+ 0x1f5f00001f7e,
+ 0x1f8000001fb5,
+ 0x1fb600001fc5,
+ 0x1fc600001fd4,
+ 0x1fd600001fdc,
+ 0x1fdd00001ff0,
+ 0x1ff200001ff5,
+ 0x1ff600001fff,
+ 0x212600002127,
+ 0xab650000ab66,
+ 0x101400001018f,
+ 0x101a0000101a1,
+ 0x1d2000001d246,
+ ),
+ 'Han': (
+ 0x2e8000002e9a,
+ 0x2e9b00002ef4,
+ 0x2f0000002fd6,
+ 0x300500003006,
+ 0x300700003008,
+ 0x30210000302a,
+ 0x30380000303c,
+ 0x340000004dc0,
+ 0x4e0000009ffd,
+ 0xf9000000fa6e,
+ 0xfa700000fada,
+ 0x16ff000016ff2,
+ 0x200000002a6de,
+ 0x2a7000002b735,
+ 0x2b7400002b81e,
+ 0x2b8200002cea2,
+ 0x2ceb00002ebe1,
+ 0x2f8000002fa1e,
+ 0x300000003134b,
+ ),
+ 'Hebrew': (
+ 0x591000005c8,
+ 0x5d0000005eb,
+ 0x5ef000005f5,
+ 0xfb1d0000fb37,
+ 0xfb380000fb3d,
+ 0xfb3e0000fb3f,
+ 0xfb400000fb42,
+ 0xfb430000fb45,
+ 0xfb460000fb50,
+ ),
+ 'Hiragana': (
+ 0x304100003097,
+ 0x309d000030a0,
+ 0x1b0010001b11f,
+ 0x1b1500001b153,
+ 0x1f2000001f201,
+ ),
+ 'Katakana': (
+ 0x30a1000030fb,
+ 0x30fd00003100,
+ 0x31f000003200,
+ 0x32d0000032ff,
+ 0x330000003358,
+ 0xff660000ff70,
+ 0xff710000ff9e,
+ 0x1b0000001b001,
+ 0x1b1640001b168,
+ ),
+}
+joining_types = {
+ 0x600: 85,
+ 0x601: 85,
+ 0x602: 85,
+ 0x603: 85,
+ 0x604: 85,
+ 0x605: 85,
+ 0x608: 85,
+ 0x60b: 85,
+ 0x620: 68,
+ 0x621: 85,
+ 0x622: 82,
+ 0x623: 82,
+ 0x624: 82,
+ 0x625: 82,
+ 0x626: 68,
+ 0x627: 82,
+ 0x628: 68,
+ 0x629: 82,
+ 0x62a: 68,
+ 0x62b: 68,
+ 0x62c: 68,
+ 0x62d: 68,
+ 0x62e: 68,
+ 0x62f: 82,
+ 0x630: 82,
+ 0x631: 82,
+ 0x632: 82,
+ 0x633: 68,
+ 0x634: 68,
+ 0x635: 68,
+ 0x636: 68,
+ 0x637: 68,
+ 0x638: 68,
+ 0x639: 68,
+ 0x63a: 68,
+ 0x63b: 68,
+ 0x63c: 68,
+ 0x63d: 68,
+ 0x63e: 68,
+ 0x63f: 68,
+ 0x640: 67,
+ 0x641: 68,
+ 0x642: 68,
+ 0x643: 68,
+ 0x644: 68,
+ 0x645: 68,
+ 0x646: 68,
+ 0x647: 68,
+ 0x648: 82,
+ 0x649: 68,
+ 0x64a: 68,
+ 0x66e: 68,
+ 0x66f: 68,
+ 0x671: 82,
+ 0x672: 82,
+ 0x673: 82,
+ 0x674: 85,
+ 0x675: 82,
+ 0x676: 82,
+ 0x677: 82,
+ 0x678: 68,
+ 0x679: 68,
+ 0x67a: 68,
+ 0x67b: 68,
+ 0x67c: 68,
+ 0x67d: 68,
+ 0x67e: 68,
+ 0x67f: 68,
+ 0x680: 68,
+ 0x681: 68,
+ 0x682: 68,
+ 0x683: 68,
+ 0x684: 68,
+ 0x685: 68,
+ 0x686: 68,
+ 0x687: 68,
+ 0x688: 82,
+ 0x689: 82,
+ 0x68a: 82,
+ 0x68b: 82,
+ 0x68c: 82,
+ 0x68d: 82,
+ 0x68e: 82,
+ 0x68f: 82,
+ 0x690: 82,
+ 0x691: 82,
+ 0x692: 82,
+ 0x693: 82,
+ 0x694: 82,
+ 0x695: 82,
+ 0x696: 82,
+ 0x697: 82,
+ 0x698: 82,
+ 0x699: 82,
+ 0x69a: 68,
+ 0x69b: 68,
+ 0x69c: 68,
+ 0x69d: 68,
+ 0x69e: 68,
+ 0x69f: 68,
+ 0x6a0: 68,
+ 0x6a1: 68,
+ 0x6a2: 68,
+ 0x6a3: 68,
+ 0x6a4: 68,
+ 0x6a5: 68,
+ 0x6a6: 68,
+ 0x6a7: 68,
+ 0x6a8: 68,
+ 0x6a9: 68,
+ 0x6aa: 68,
+ 0x6ab: 68,
+ 0x6ac: 68,
+ 0x6ad: 68,
+ 0x6ae: 68,
+ 0x6af: 68,
+ 0x6b0: 68,
+ 0x6b1: 68,
+ 0x6b2: 68,
+ 0x6b3: 68,
+ 0x6b4: 68,
+ 0x6b5: 68,
+ 0x6b6: 68,
+ 0x6b7: 68,
+ 0x6b8: 68,
+ 0x6b9: 68,
+ 0x6ba: 68,
+ 0x6bb: 68,
+ 0x6bc: 68,
+ 0x6bd: 68,
+ 0x6be: 68,
+ 0x6bf: 68,
+ 0x6c0: 82,
+ 0x6c1: 68,
+ 0x6c2: 68,
+ 0x6c3: 82,
+ 0x6c4: 82,
+ 0x6c5: 82,
+ 0x6c6: 82,
+ 0x6c7: 82,
+ 0x6c8: 82,
+ 0x6c9: 82,
+ 0x6ca: 82,
+ 0x6cb: 82,
+ 0x6cc: 68,
+ 0x6cd: 82,
+ 0x6ce: 68,
+ 0x6cf: 82,
+ 0x6d0: 68,
+ 0x6d1: 68,
+ 0x6d2: 82,
+ 0x6d3: 82,
+ 0x6d5: 82,
+ 0x6dd: 85,
+ 0x6ee: 82,
+ 0x6ef: 82,
+ 0x6fa: 68,
+ 0x6fb: 68,
+ 0x6fc: 68,
+ 0x6ff: 68,
+ 0x70f: 84,
+ 0x710: 82,
+ 0x712: 68,
+ 0x713: 68,
+ 0x714: 68,
+ 0x715: 82,
+ 0x716: 82,
+ 0x717: 82,
+ 0x718: 82,
+ 0x719: 82,
+ 0x71a: 68,
+ 0x71b: 68,
+ 0x71c: 68,
+ 0x71d: 68,
+ 0x71e: 82,
+ 0x71f: 68,
+ 0x720: 68,
+ 0x721: 68,
+ 0x722: 68,
+ 0x723: 68,
+ 0x724: 68,
+ 0x725: 68,
+ 0x726: 68,
+ 0x727: 68,
+ 0x728: 82,
+ 0x729: 68,
+ 0x72a: 82,
+ 0x72b: 68,
+ 0x72c: 82,
+ 0x72d: 68,
+ 0x72e: 68,
+ 0x72f: 82,
+ 0x74d: 82,
+ 0x74e: 68,
+ 0x74f: 68,
+ 0x750: 68,
+ 0x751: 68,
+ 0x752: 68,
+ 0x753: 68,
+ 0x754: 68,
+ 0x755: 68,
+ 0x756: 68,
+ 0x757: 68,
+ 0x758: 68,
+ 0x759: 82,
+ 0x75a: 82,
+ 0x75b: 82,
+ 0x75c: 68,
+ 0x75d: 68,
+ 0x75e: 68,
+ 0x75f: 68,
+ 0x760: 68,
+ 0x761: 68,
+ 0x762: 68,
+ 0x763: 68,
+ 0x764: 68,
+ 0x765: 68,
+ 0x766: 68,
+ 0x767: 68,
+ 0x768: 68,
+ 0x769: 68,
+ 0x76a: 68,
+ 0x76b: 82,
+ 0x76c: 82,
+ 0x76d: 68,
+ 0x76e: 68,
+ 0x76f: 68,
+ 0x770: 68,
+ 0x771: 82,
+ 0x772: 68,
+ 0x773: 82,
+ 0x774: 82,
+ 0x775: 68,
+ 0x776: 68,
+ 0x777: 68,
+ 0x778: 82,
+ 0x779: 82,
+ 0x77a: 68,
+ 0x77b: 68,
+ 0x77c: 68,
+ 0x77d: 68,
+ 0x77e: 68,
+ 0x77f: 68,
+ 0x7ca: 68,
+ 0x7cb: 68,
+ 0x7cc: 68,
+ 0x7cd: 68,
+ 0x7ce: 68,
+ 0x7cf: 68,
+ 0x7d0: 68,
+ 0x7d1: 68,
+ 0x7d2: 68,
+ 0x7d3: 68,
+ 0x7d4: 68,
+ 0x7d5: 68,
+ 0x7d6: 68,
+ 0x7d7: 68,
+ 0x7d8: 68,
+ 0x7d9: 68,
+ 0x7da: 68,
+ 0x7db: 68,
+ 0x7dc: 68,
+ 0x7dd: 68,
+ 0x7de: 68,
+ 0x7df: 68,
+ 0x7e0: 68,
+ 0x7e1: 68,
+ 0x7e2: 68,
+ 0x7e3: 68,
+ 0x7e4: 68,
+ 0x7e5: 68,
+ 0x7e6: 68,
+ 0x7e7: 68,
+ 0x7e8: 68,
+ 0x7e9: 68,
+ 0x7ea: 68,
+ 0x7fa: 67,
+ 0x840: 82,
+ 0x841: 68,
+ 0x842: 68,
+ 0x843: 68,
+ 0x844: 68,
+ 0x845: 68,
+ 0x846: 82,
+ 0x847: 82,
+ 0x848: 68,
+ 0x849: 82,
+ 0x84a: 68,
+ 0x84b: 68,
+ 0x84c: 68,
+ 0x84d: 68,
+ 0x84e: 68,
+ 0x84f: 68,
+ 0x850: 68,
+ 0x851: 68,
+ 0x852: 68,
+ 0x853: 68,
+ 0x854: 82,
+ 0x855: 68,
+ 0x856: 82,
+ 0x857: 82,
+ 0x858: 82,
+ 0x860: 68,
+ 0x861: 85,
+ 0x862: 68,
+ 0x863: 68,
+ 0x864: 68,
+ 0x865: 68,
+ 0x866: 85,
+ 0x867: 82,
+ 0x868: 68,
+ 0x869: 82,
+ 0x86a: 82,
+ 0x8a0: 68,
+ 0x8a1: 68,
+ 0x8a2: 68,
+ 0x8a3: 68,
+ 0x8a4: 68,
+ 0x8a5: 68,
+ 0x8a6: 68,
+ 0x8a7: 68,
+ 0x8a8: 68,
+ 0x8a9: 68,
+ 0x8aa: 82,
+ 0x8ab: 82,
+ 0x8ac: 82,
+ 0x8ad: 85,
+ 0x8ae: 82,
+ 0x8af: 68,
+ 0x8b0: 68,
+ 0x8b1: 82,
+ 0x8b2: 82,
+ 0x8b3: 68,
+ 0x8b4: 68,
+ 0x8b6: 68,
+ 0x8b7: 68,
+ 0x8b8: 68,
+ 0x8b9: 82,
+ 0x8ba: 68,
+ 0x8bb: 68,
+ 0x8bc: 68,
+ 0x8bd: 68,
+ 0x8be: 68,
+ 0x8bf: 68,
+ 0x8c0: 68,
+ 0x8c1: 68,
+ 0x8c2: 68,
+ 0x8c3: 68,
+ 0x8c4: 68,
+ 0x8c5: 68,
+ 0x8c6: 68,
+ 0x8c7: 68,
+ 0x8e2: 85,
+ 0x1806: 85,
+ 0x1807: 68,
+ 0x180a: 67,
+ 0x180e: 85,
+ 0x1820: 68,
+ 0x1821: 68,
+ 0x1822: 68,
+ 0x1823: 68,
+ 0x1824: 68,
+ 0x1825: 68,
+ 0x1826: 68,
+ 0x1827: 68,
+ 0x1828: 68,
+ 0x1829: 68,
+ 0x182a: 68,
+ 0x182b: 68,
+ 0x182c: 68,
+ 0x182d: 68,
+ 0x182e: 68,
+ 0x182f: 68,
+ 0x1830: 68,
+ 0x1831: 68,
+ 0x1832: 68,
+ 0x1833: 68,
+ 0x1834: 68,
+ 0x1835: 68,
+ 0x1836: 68,
+ 0x1837: 68,
+ 0x1838: 68,
+ 0x1839: 68,
+ 0x183a: 68,
+ 0x183b: 68,
+ 0x183c: 68,
+ 0x183d: 68,
+ 0x183e: 68,
+ 0x183f: 68,
+ 0x1840: 68,
+ 0x1841: 68,
+ 0x1842: 68,
+ 0x1843: 68,
+ 0x1844: 68,
+ 0x1845: 68,
+ 0x1846: 68,
+ 0x1847: 68,
+ 0x1848: 68,
+ 0x1849: 68,
+ 0x184a: 68,
+ 0x184b: 68,
+ 0x184c: 68,
+ 0x184d: 68,
+ 0x184e: 68,
+ 0x184f: 68,
+ 0x1850: 68,
+ 0x1851: 68,
+ 0x1852: 68,
+ 0x1853: 68,
+ 0x1854: 68,
+ 0x1855: 68,
+ 0x1856: 68,
+ 0x1857: 68,
+ 0x1858: 68,
+ 0x1859: 68,
+ 0x185a: 68,
+ 0x185b: 68,
+ 0x185c: 68,
+ 0x185d: 68,
+ 0x185e: 68,
+ 0x185f: 68,
+ 0x1860: 68,
+ 0x1861: 68,
+ 0x1862: 68,
+ 0x1863: 68,
+ 0x1864: 68,
+ 0x1865: 68,
+ 0x1866: 68,
+ 0x1867: 68,
+ 0x1868: 68,
+ 0x1869: 68,
+ 0x186a: 68,
+ 0x186b: 68,
+ 0x186c: 68,
+ 0x186d: 68,
+ 0x186e: 68,
+ 0x186f: 68,
+ 0x1870: 68,
+ 0x1871: 68,
+ 0x1872: 68,
+ 0x1873: 68,
+ 0x1874: 68,
+ 0x1875: 68,
+ 0x1876: 68,
+ 0x1877: 68,
+ 0x1878: 68,
+ 0x1880: 85,
+ 0x1881: 85,
+ 0x1882: 85,
+ 0x1883: 85,
+ 0x1884: 85,
+ 0x1885: 84,
+ 0x1886: 84,
+ 0x1887: 68,
+ 0x1888: 68,
+ 0x1889: 68,
+ 0x188a: 68,
+ 0x188b: 68,
+ 0x188c: 68,
+ 0x188d: 68,
+ 0x188e: 68,
+ 0x188f: 68,
+ 0x1890: 68,
+ 0x1891: 68,
+ 0x1892: 68,
+ 0x1893: 68,
+ 0x1894: 68,
+ 0x1895: 68,
+ 0x1896: 68,
+ 0x1897: 68,
+ 0x1898: 68,
+ 0x1899: 68,
+ 0x189a: 68,
+ 0x189b: 68,
+ 0x189c: 68,
+ 0x189d: 68,
+ 0x189e: 68,
+ 0x189f: 68,
+ 0x18a0: 68,
+ 0x18a1: 68,
+ 0x18a2: 68,
+ 0x18a3: 68,
+ 0x18a4: 68,
+ 0x18a5: 68,
+ 0x18a6: 68,
+ 0x18a7: 68,
+ 0x18a8: 68,
+ 0x18aa: 68,
+ 0x200c: 85,
+ 0x200d: 67,
+ 0x202f: 85,
+ 0x2066: 85,
+ 0x2067: 85,
+ 0x2068: 85,
+ 0x2069: 85,
+ 0xa840: 68,
+ 0xa841: 68,
+ 0xa842: 68,
+ 0xa843: 68,
+ 0xa844: 68,
+ 0xa845: 68,
+ 0xa846: 68,
+ 0xa847: 68,
+ 0xa848: 68,
+ 0xa849: 68,
+ 0xa84a: 68,
+ 0xa84b: 68,
+ 0xa84c: 68,
+ 0xa84d: 68,
+ 0xa84e: 68,
+ 0xa84f: 68,
+ 0xa850: 68,
+ 0xa851: 68,
+ 0xa852: 68,
+ 0xa853: 68,
+ 0xa854: 68,
+ 0xa855: 68,
+ 0xa856: 68,
+ 0xa857: 68,
+ 0xa858: 68,
+ 0xa859: 68,
+ 0xa85a: 68,
+ 0xa85b: 68,
+ 0xa85c: 68,
+ 0xa85d: 68,
+ 0xa85e: 68,
+ 0xa85f: 68,
+ 0xa860: 68,
+ 0xa861: 68,
+ 0xa862: 68,
+ 0xa863: 68,
+ 0xa864: 68,
+ 0xa865: 68,
+ 0xa866: 68,
+ 0xa867: 68,
+ 0xa868: 68,
+ 0xa869: 68,
+ 0xa86a: 68,
+ 0xa86b: 68,
+ 0xa86c: 68,
+ 0xa86d: 68,
+ 0xa86e: 68,
+ 0xa86f: 68,
+ 0xa870: 68,
+ 0xa871: 68,
+ 0xa872: 76,
+ 0xa873: 85,
+ 0x10ac0: 68,
+ 0x10ac1: 68,
+ 0x10ac2: 68,
+ 0x10ac3: 68,
+ 0x10ac4: 68,
+ 0x10ac5: 82,
+ 0x10ac6: 85,
+ 0x10ac7: 82,
+ 0x10ac8: 85,
+ 0x10ac9: 82,
+ 0x10aca: 82,
+ 0x10acb: 85,
+ 0x10acc: 85,
+ 0x10acd: 76,
+ 0x10ace: 82,
+ 0x10acf: 82,
+ 0x10ad0: 82,
+ 0x10ad1: 82,
+ 0x10ad2: 82,
+ 0x10ad3: 68,
+ 0x10ad4: 68,
+ 0x10ad5: 68,
+ 0x10ad6: 68,
+ 0x10ad7: 76,
+ 0x10ad8: 68,
+ 0x10ad9: 68,
+ 0x10ada: 68,
+ 0x10adb: 68,
+ 0x10adc: 68,
+ 0x10add: 82,
+ 0x10ade: 68,
+ 0x10adf: 68,
+ 0x10ae0: 68,
+ 0x10ae1: 82,
+ 0x10ae2: 85,
+ 0x10ae3: 85,
+ 0x10ae4: 82,
+ 0x10aeb: 68,
+ 0x10aec: 68,
+ 0x10aed: 68,
+ 0x10aee: 68,
+ 0x10aef: 82,
+ 0x10b80: 68,
+ 0x10b81: 82,
+ 0x10b82: 68,
+ 0x10b83: 82,
+ 0x10b84: 82,
+ 0x10b85: 82,
+ 0x10b86: 68,
+ 0x10b87: 68,
+ 0x10b88: 68,
+ 0x10b89: 82,
+ 0x10b8a: 68,
+ 0x10b8b: 68,
+ 0x10b8c: 82,
+ 0x10b8d: 68,
+ 0x10b8e: 82,
+ 0x10b8f: 82,
+ 0x10b90: 68,
+ 0x10b91: 82,
+ 0x10ba9: 82,
+ 0x10baa: 82,
+ 0x10bab: 82,
+ 0x10bac: 82,
+ 0x10bad: 68,
+ 0x10bae: 68,
+ 0x10baf: 85,
+ 0x10d00: 76,
+ 0x10d01: 68,
+ 0x10d02: 68,
+ 0x10d03: 68,
+ 0x10d04: 68,
+ 0x10d05: 68,
+ 0x10d06: 68,
+ 0x10d07: 68,
+ 0x10d08: 68,
+ 0x10d09: 68,
+ 0x10d0a: 68,
+ 0x10d0b: 68,
+ 0x10d0c: 68,
+ 0x10d0d: 68,
+ 0x10d0e: 68,
+ 0x10d0f: 68,
+ 0x10d10: 68,
+ 0x10d11: 68,
+ 0x10d12: 68,
+ 0x10d13: 68,
+ 0x10d14: 68,
+ 0x10d15: 68,
+ 0x10d16: 68,
+ 0x10d17: 68,
+ 0x10d18: 68,
+ 0x10d19: 68,
+ 0x10d1a: 68,
+ 0x10d1b: 68,
+ 0x10d1c: 68,
+ 0x10d1d: 68,
+ 0x10d1e: 68,
+ 0x10d1f: 68,
+ 0x10d20: 68,
+ 0x10d21: 68,
+ 0x10d22: 82,
+ 0x10d23: 68,
+ 0x10f30: 68,
+ 0x10f31: 68,
+ 0x10f32: 68,
+ 0x10f33: 82,
+ 0x10f34: 68,
+ 0x10f35: 68,
+ 0x10f36: 68,
+ 0x10f37: 68,
+ 0x10f38: 68,
+ 0x10f39: 68,
+ 0x10f3a: 68,
+ 0x10f3b: 68,
+ 0x10f3c: 68,
+ 0x10f3d: 68,
+ 0x10f3e: 68,
+ 0x10f3f: 68,
+ 0x10f40: 68,
+ 0x10f41: 68,
+ 0x10f42: 68,
+ 0x10f43: 68,
+ 0x10f44: 68,
+ 0x10f45: 85,
+ 0x10f51: 68,
+ 0x10f52: 68,
+ 0x10f53: 68,
+ 0x10f54: 82,
+ 0x10fb0: 68,
+ 0x10fb1: 85,
+ 0x10fb2: 68,
+ 0x10fb3: 68,
+ 0x10fb4: 82,
+ 0x10fb5: 82,
+ 0x10fb6: 82,
+ 0x10fb7: 85,
+ 0x10fb8: 68,
+ 0x10fb9: 82,
+ 0x10fba: 82,
+ 0x10fbb: 68,
+ 0x10fbc: 68,
+ 0x10fbd: 82,
+ 0x10fbe: 68,
+ 0x10fbf: 68,
+ 0x10fc0: 85,
+ 0x10fc1: 68,
+ 0x10fc2: 82,
+ 0x10fc3: 82,
+ 0x10fc4: 68,
+ 0x10fc5: 85,
+ 0x10fc6: 85,
+ 0x10fc7: 85,
+ 0x10fc8: 85,
+ 0x10fc9: 82,
+ 0x10fca: 68,
+ 0x10fcb: 76,
+ 0x110bd: 85,
+ 0x110cd: 85,
+ 0x1e900: 68,
+ 0x1e901: 68,
+ 0x1e902: 68,
+ 0x1e903: 68,
+ 0x1e904: 68,
+ 0x1e905: 68,
+ 0x1e906: 68,
+ 0x1e907: 68,
+ 0x1e908: 68,
+ 0x1e909: 68,
+ 0x1e90a: 68,
+ 0x1e90b: 68,
+ 0x1e90c: 68,
+ 0x1e90d: 68,
+ 0x1e90e: 68,
+ 0x1e90f: 68,
+ 0x1e910: 68,
+ 0x1e911: 68,
+ 0x1e912: 68,
+ 0x1e913: 68,
+ 0x1e914: 68,
+ 0x1e915: 68,
+ 0x1e916: 68,
+ 0x1e917: 68,
+ 0x1e918: 68,
+ 0x1e919: 68,
+ 0x1e91a: 68,
+ 0x1e91b: 68,
+ 0x1e91c: 68,
+ 0x1e91d: 68,
+ 0x1e91e: 68,
+ 0x1e91f: 68,
+ 0x1e920: 68,
+ 0x1e921: 68,
+ 0x1e922: 68,
+ 0x1e923: 68,
+ 0x1e924: 68,
+ 0x1e925: 68,
+ 0x1e926: 68,
+ 0x1e927: 68,
+ 0x1e928: 68,
+ 0x1e929: 68,
+ 0x1e92a: 68,
+ 0x1e92b: 68,
+ 0x1e92c: 68,
+ 0x1e92d: 68,
+ 0x1e92e: 68,
+ 0x1e92f: 68,
+ 0x1e930: 68,
+ 0x1e931: 68,
+ 0x1e932: 68,
+ 0x1e933: 68,
+ 0x1e934: 68,
+ 0x1e935: 68,
+ 0x1e936: 68,
+ 0x1e937: 68,
+ 0x1e938: 68,
+ 0x1e939: 68,
+ 0x1e93a: 68,
+ 0x1e93b: 68,
+ 0x1e93c: 68,
+ 0x1e93d: 68,
+ 0x1e93e: 68,
+ 0x1e93f: 68,
+ 0x1e940: 68,
+ 0x1e941: 68,
+ 0x1e942: 68,
+ 0x1e943: 68,
+ 0x1e94b: 84,
+}
+codepoint_classes = {
+ 'PVALID': (
+ 0x2d0000002e,
+ 0x300000003a,
+ 0x610000007b,
+ 0xdf000000f7,
+ 0xf800000100,
+ 0x10100000102,
+ 0x10300000104,
+ 0x10500000106,
+ 0x10700000108,
+ 0x1090000010a,
+ 0x10b0000010c,
+ 0x10d0000010e,
+ 0x10f00000110,
+ 0x11100000112,
+ 0x11300000114,
+ 0x11500000116,
+ 0x11700000118,
+ 0x1190000011a,
+ 0x11b0000011c,
+ 0x11d0000011e,
+ 0x11f00000120,
+ 0x12100000122,
+ 0x12300000124,
+ 0x12500000126,
+ 0x12700000128,
+ 0x1290000012a,
+ 0x12b0000012c,
+ 0x12d0000012e,
+ 0x12f00000130,
+ 0x13100000132,
+ 0x13500000136,
+ 0x13700000139,
+ 0x13a0000013b,
+ 0x13c0000013d,
+ 0x13e0000013f,
+ 0x14200000143,
+ 0x14400000145,
+ 0x14600000147,
+ 0x14800000149,
+ 0x14b0000014c,
+ 0x14d0000014e,
+ 0x14f00000150,
+ 0x15100000152,
+ 0x15300000154,
+ 0x15500000156,
+ 0x15700000158,
+ 0x1590000015a,
+ 0x15b0000015c,
+ 0x15d0000015e,
+ 0x15f00000160,
+ 0x16100000162,
+ 0x16300000164,
+ 0x16500000166,
+ 0x16700000168,
+ 0x1690000016a,
+ 0x16b0000016c,
+ 0x16d0000016e,
+ 0x16f00000170,
+ 0x17100000172,
+ 0x17300000174,
+ 0x17500000176,
+ 0x17700000178,
+ 0x17a0000017b,
+ 0x17c0000017d,
+ 0x17e0000017f,
+ 0x18000000181,
+ 0x18300000184,
+ 0x18500000186,
+ 0x18800000189,
+ 0x18c0000018e,
+ 0x19200000193,
+ 0x19500000196,
+ 0x1990000019c,
+ 0x19e0000019f,
+ 0x1a1000001a2,
+ 0x1a3000001a4,
+ 0x1a5000001a6,
+ 0x1a8000001a9,
+ 0x1aa000001ac,
+ 0x1ad000001ae,
+ 0x1b0000001b1,
+ 0x1b4000001b5,
+ 0x1b6000001b7,
+ 0x1b9000001bc,
+ 0x1bd000001c4,
+ 0x1ce000001cf,
+ 0x1d0000001d1,
+ 0x1d2000001d3,
+ 0x1d4000001d5,
+ 0x1d6000001d7,
+ 0x1d8000001d9,
+ 0x1da000001db,
+ 0x1dc000001de,
+ 0x1df000001e0,
+ 0x1e1000001e2,
+ 0x1e3000001e4,
+ 0x1e5000001e6,
+ 0x1e7000001e8,
+ 0x1e9000001ea,
+ 0x1eb000001ec,
+ 0x1ed000001ee,
+ 0x1ef000001f1,
+ 0x1f5000001f6,
+ 0x1f9000001fa,
+ 0x1fb000001fc,
+ 0x1fd000001fe,
+ 0x1ff00000200,
+ 0x20100000202,
+ 0x20300000204,
+ 0x20500000206,
+ 0x20700000208,
+ 0x2090000020a,
+ 0x20b0000020c,
+ 0x20d0000020e,
+ 0x20f00000210,
+ 0x21100000212,
+ 0x21300000214,
+ 0x21500000216,
+ 0x21700000218,
+ 0x2190000021a,
+ 0x21b0000021c,
+ 0x21d0000021e,
+ 0x21f00000220,
+ 0x22100000222,
+ 0x22300000224,
+ 0x22500000226,
+ 0x22700000228,
+ 0x2290000022a,
+ 0x22b0000022c,
+ 0x22d0000022e,
+ 0x22f00000230,
+ 0x23100000232,
+ 0x2330000023a,
+ 0x23c0000023d,
+ 0x23f00000241,
+ 0x24200000243,
+ 0x24700000248,
+ 0x2490000024a,
+ 0x24b0000024c,
+ 0x24d0000024e,
+ 0x24f000002b0,
+ 0x2b9000002c2,
+ 0x2c6000002d2,
+ 0x2ec000002ed,
+ 0x2ee000002ef,
+ 0x30000000340,
+ 0x34200000343,
+ 0x3460000034f,
+ 0x35000000370,
+ 0x37100000372,
+ 0x37300000374,
+ 0x37700000378,
+ 0x37b0000037e,
+ 0x39000000391,
+ 0x3ac000003cf,
+ 0x3d7000003d8,
+ 0x3d9000003da,
+ 0x3db000003dc,
+ 0x3dd000003de,
+ 0x3df000003e0,
+ 0x3e1000003e2,
+ 0x3e3000003e4,
+ 0x3e5000003e6,
+ 0x3e7000003e8,
+ 0x3e9000003ea,
+ 0x3eb000003ec,
+ 0x3ed000003ee,
+ 0x3ef000003f0,
+ 0x3f3000003f4,
+ 0x3f8000003f9,
+ 0x3fb000003fd,
+ 0x43000000460,
+ 0x46100000462,
+ 0x46300000464,
+ 0x46500000466,
+ 0x46700000468,
+ 0x4690000046a,
+ 0x46b0000046c,
+ 0x46d0000046e,
+ 0x46f00000470,
+ 0x47100000472,
+ 0x47300000474,
+ 0x47500000476,
+ 0x47700000478,
+ 0x4790000047a,
+ 0x47b0000047c,
+ 0x47d0000047e,
+ 0x47f00000480,
+ 0x48100000482,
+ 0x48300000488,
+ 0x48b0000048c,
+ 0x48d0000048e,
+ 0x48f00000490,
+ 0x49100000492,
+ 0x49300000494,
+ 0x49500000496,
+ 0x49700000498,
+ 0x4990000049a,
+ 0x49b0000049c,
+ 0x49d0000049e,
+ 0x49f000004a0,
+ 0x4a1000004a2,
+ 0x4a3000004a4,
+ 0x4a5000004a6,
+ 0x4a7000004a8,
+ 0x4a9000004aa,
+ 0x4ab000004ac,
+ 0x4ad000004ae,
+ 0x4af000004b0,
+ 0x4b1000004b2,
+ 0x4b3000004b4,
+ 0x4b5000004b6,
+ 0x4b7000004b8,
+ 0x4b9000004ba,
+ 0x4bb000004bc,
+ 0x4bd000004be,
+ 0x4bf000004c0,
+ 0x4c2000004c3,
+ 0x4c4000004c5,
+ 0x4c6000004c7,
+ 0x4c8000004c9,
+ 0x4ca000004cb,
+ 0x4cc000004cd,
+ 0x4ce000004d0,
+ 0x4d1000004d2,
+ 0x4d3000004d4,
+ 0x4d5000004d6,
+ 0x4d7000004d8,
+ 0x4d9000004da,
+ 0x4db000004dc,
+ 0x4dd000004de,
+ 0x4df000004e0,
+ 0x4e1000004e2,
+ 0x4e3000004e4,
+ 0x4e5000004e6,
+ 0x4e7000004e8,
+ 0x4e9000004ea,
+ 0x4eb000004ec,
+ 0x4ed000004ee,
+ 0x4ef000004f0,
+ 0x4f1000004f2,
+ 0x4f3000004f4,
+ 0x4f5000004f6,
+ 0x4f7000004f8,
+ 0x4f9000004fa,
+ 0x4fb000004fc,
+ 0x4fd000004fe,
+ 0x4ff00000500,
+ 0x50100000502,
+ 0x50300000504,
+ 0x50500000506,
+ 0x50700000508,
+ 0x5090000050a,
+ 0x50b0000050c,
+ 0x50d0000050e,
+ 0x50f00000510,
+ 0x51100000512,
+ 0x51300000514,
+ 0x51500000516,
+ 0x51700000518,
+ 0x5190000051a,
+ 0x51b0000051c,
+ 0x51d0000051e,
+ 0x51f00000520,
+ 0x52100000522,
+ 0x52300000524,
+ 0x52500000526,
+ 0x52700000528,
+ 0x5290000052a,
+ 0x52b0000052c,
+ 0x52d0000052e,
+ 0x52f00000530,
+ 0x5590000055a,
+ 0x56000000587,
+ 0x58800000589,
+ 0x591000005be,
+ 0x5bf000005c0,
+ 0x5c1000005c3,
+ 0x5c4000005c6,
+ 0x5c7000005c8,
+ 0x5d0000005eb,
+ 0x5ef000005f3,
+ 0x6100000061b,
+ 0x62000000640,
+ 0x64100000660,
+ 0x66e00000675,
+ 0x679000006d4,
+ 0x6d5000006dd,
+ 0x6df000006e9,
+ 0x6ea000006f0,
+ 0x6fa00000700,
+ 0x7100000074b,
+ 0x74d000007b2,
+ 0x7c0000007f6,
+ 0x7fd000007fe,
+ 0x8000000082e,
+ 0x8400000085c,
+ 0x8600000086b,
+ 0x8a0000008b5,
+ 0x8b6000008c8,
+ 0x8d3000008e2,
+ 0x8e300000958,
+ 0x96000000964,
+ 0x96600000970,
+ 0x97100000984,
+ 0x9850000098d,
+ 0x98f00000991,
+ 0x993000009a9,
+ 0x9aa000009b1,
+ 0x9b2000009b3,
+ 0x9b6000009ba,
+ 0x9bc000009c5,
+ 0x9c7000009c9,
+ 0x9cb000009cf,
+ 0x9d7000009d8,
+ 0x9e0000009e4,
+ 0x9e6000009f2,
+ 0x9fc000009fd,
+ 0x9fe000009ff,
+ 0xa0100000a04,
+ 0xa0500000a0b,
+ 0xa0f00000a11,
+ 0xa1300000a29,
+ 0xa2a00000a31,
+ 0xa3200000a33,
+ 0xa3500000a36,
+ 0xa3800000a3a,
+ 0xa3c00000a3d,
+ 0xa3e00000a43,
+ 0xa4700000a49,
+ 0xa4b00000a4e,
+ 0xa5100000a52,
+ 0xa5c00000a5d,
+ 0xa6600000a76,
+ 0xa8100000a84,
+ 0xa8500000a8e,
+ 0xa8f00000a92,
+ 0xa9300000aa9,
+ 0xaaa00000ab1,
+ 0xab200000ab4,
+ 0xab500000aba,
+ 0xabc00000ac6,
+ 0xac700000aca,
+ 0xacb00000ace,
+ 0xad000000ad1,
+ 0xae000000ae4,
+ 0xae600000af0,
+ 0xaf900000b00,
+ 0xb0100000b04,
+ 0xb0500000b0d,
+ 0xb0f00000b11,
+ 0xb1300000b29,
+ 0xb2a00000b31,
+ 0xb3200000b34,
+ 0xb3500000b3a,
+ 0xb3c00000b45,
+ 0xb4700000b49,
+ 0xb4b00000b4e,
+ 0xb5500000b58,
+ 0xb5f00000b64,
+ 0xb6600000b70,
+ 0xb7100000b72,
+ 0xb8200000b84,
+ 0xb8500000b8b,
+ 0xb8e00000b91,
+ 0xb9200000b96,
+ 0xb9900000b9b,
+ 0xb9c00000b9d,
+ 0xb9e00000ba0,
+ 0xba300000ba5,
+ 0xba800000bab,
+ 0xbae00000bba,
+ 0xbbe00000bc3,
+ 0xbc600000bc9,
+ 0xbca00000bce,
+ 0xbd000000bd1,
+ 0xbd700000bd8,
+ 0xbe600000bf0,
+ 0xc0000000c0d,
+ 0xc0e00000c11,
+ 0xc1200000c29,
+ 0xc2a00000c3a,
+ 0xc3d00000c45,
+ 0xc4600000c49,
+ 0xc4a00000c4e,
+ 0xc5500000c57,
+ 0xc5800000c5b,
+ 0xc6000000c64,
+ 0xc6600000c70,
+ 0xc8000000c84,
+ 0xc8500000c8d,
+ 0xc8e00000c91,
+ 0xc9200000ca9,
+ 0xcaa00000cb4,
+ 0xcb500000cba,
+ 0xcbc00000cc5,
+ 0xcc600000cc9,
+ 0xcca00000cce,
+ 0xcd500000cd7,
+ 0xcde00000cdf,
+ 0xce000000ce4,
+ 0xce600000cf0,
+ 0xcf100000cf3,
+ 0xd0000000d0d,
+ 0xd0e00000d11,
+ 0xd1200000d45,
+ 0xd4600000d49,
+ 0xd4a00000d4f,
+ 0xd5400000d58,
+ 0xd5f00000d64,
+ 0xd6600000d70,
+ 0xd7a00000d80,
+ 0xd8100000d84,
+ 0xd8500000d97,
+ 0xd9a00000db2,
+ 0xdb300000dbc,
+ 0xdbd00000dbe,
+ 0xdc000000dc7,
+ 0xdca00000dcb,
+ 0xdcf00000dd5,
+ 0xdd600000dd7,
+ 0xdd800000de0,
+ 0xde600000df0,
+ 0xdf200000df4,
+ 0xe0100000e33,
+ 0xe3400000e3b,
+ 0xe4000000e4f,
+ 0xe5000000e5a,
+ 0xe8100000e83,
+ 0xe8400000e85,
+ 0xe8600000e8b,
+ 0xe8c00000ea4,
+ 0xea500000ea6,
+ 0xea700000eb3,
+ 0xeb400000ebe,
+ 0xec000000ec5,
+ 0xec600000ec7,
+ 0xec800000ece,
+ 0xed000000eda,
+ 0xede00000ee0,
+ 0xf0000000f01,
+ 0xf0b00000f0c,
+ 0xf1800000f1a,
+ 0xf2000000f2a,
+ 0xf3500000f36,
+ 0xf3700000f38,
+ 0xf3900000f3a,
+ 0xf3e00000f43,
+ 0xf4400000f48,
+ 0xf4900000f4d,
+ 0xf4e00000f52,
+ 0xf5300000f57,
+ 0xf5800000f5c,
+ 0xf5d00000f69,
+ 0xf6a00000f6d,
+ 0xf7100000f73,
+ 0xf7400000f75,
+ 0xf7a00000f81,
+ 0xf8200000f85,
+ 0xf8600000f93,
+ 0xf9400000f98,
+ 0xf9900000f9d,
+ 0xf9e00000fa2,
+ 0xfa300000fa7,
+ 0xfa800000fac,
+ 0xfad00000fb9,
+ 0xfba00000fbd,
+ 0xfc600000fc7,
+ 0x10000000104a,
+ 0x10500000109e,
+ 0x10d0000010fb,
+ 0x10fd00001100,
+ 0x120000001249,
+ 0x124a0000124e,
+ 0x125000001257,
+ 0x125800001259,
+ 0x125a0000125e,
+ 0x126000001289,
+ 0x128a0000128e,
+ 0x1290000012b1,
+ 0x12b2000012b6,
+ 0x12b8000012bf,
+ 0x12c0000012c1,
+ 0x12c2000012c6,
+ 0x12c8000012d7,
+ 0x12d800001311,
+ 0x131200001316,
+ 0x13180000135b,
+ 0x135d00001360,
+ 0x138000001390,
+ 0x13a0000013f6,
+ 0x14010000166d,
+ 0x166f00001680,
+ 0x16810000169b,
+ 0x16a0000016eb,
+ 0x16f1000016f9,
+ 0x17000000170d,
+ 0x170e00001715,
+ 0x172000001735,
+ 0x174000001754,
+ 0x17600000176d,
+ 0x176e00001771,
+ 0x177200001774,
+ 0x1780000017b4,
+ 0x17b6000017d4,
+ 0x17d7000017d8,
+ 0x17dc000017de,
+ 0x17e0000017ea,
+ 0x18100000181a,
+ 0x182000001879,
+ 0x1880000018ab,
+ 0x18b0000018f6,
+ 0x19000000191f,
+ 0x19200000192c,
+ 0x19300000193c,
+ 0x19460000196e,
+ 0x197000001975,
+ 0x1980000019ac,
+ 0x19b0000019ca,
+ 0x19d0000019da,
+ 0x1a0000001a1c,
+ 0x1a2000001a5f,
+ 0x1a6000001a7d,
+ 0x1a7f00001a8a,
+ 0x1a9000001a9a,
+ 0x1aa700001aa8,
+ 0x1ab000001abe,
+ 0x1abf00001ac1,
+ 0x1b0000001b4c,
+ 0x1b5000001b5a,
+ 0x1b6b00001b74,
+ 0x1b8000001bf4,
+ 0x1c0000001c38,
+ 0x1c4000001c4a,
+ 0x1c4d00001c7e,
+ 0x1cd000001cd3,
+ 0x1cd400001cfb,
+ 0x1d0000001d2c,
+ 0x1d2f00001d30,
+ 0x1d3b00001d3c,
+ 0x1d4e00001d4f,
+ 0x1d6b00001d78,
+ 0x1d7900001d9b,
+ 0x1dc000001dfa,
+ 0x1dfb00001e00,
+ 0x1e0100001e02,
+ 0x1e0300001e04,
+ 0x1e0500001e06,
+ 0x1e0700001e08,
+ 0x1e0900001e0a,
+ 0x1e0b00001e0c,
+ 0x1e0d00001e0e,
+ 0x1e0f00001e10,
+ 0x1e1100001e12,
+ 0x1e1300001e14,
+ 0x1e1500001e16,
+ 0x1e1700001e18,
+ 0x1e1900001e1a,
+ 0x1e1b00001e1c,
+ 0x1e1d00001e1e,
+ 0x1e1f00001e20,
+ 0x1e2100001e22,
+ 0x1e2300001e24,
+ 0x1e2500001e26,
+ 0x1e2700001e28,
+ 0x1e2900001e2a,
+ 0x1e2b00001e2c,
+ 0x1e2d00001e2e,
+ 0x1e2f00001e30,
+ 0x1e3100001e32,
+ 0x1e3300001e34,
+ 0x1e3500001e36,
+ 0x1e3700001e38,
+ 0x1e3900001e3a,
+ 0x1e3b00001e3c,
+ 0x1e3d00001e3e,
+ 0x1e3f00001e40,
+ 0x1e4100001e42,
+ 0x1e4300001e44,
+ 0x1e4500001e46,
+ 0x1e4700001e48,
+ 0x1e4900001e4a,
+ 0x1e4b00001e4c,
+ 0x1e4d00001e4e,
+ 0x1e4f00001e50,
+ 0x1e5100001e52,
+ 0x1e5300001e54,
+ 0x1e5500001e56,
+ 0x1e5700001e58,
+ 0x1e5900001e5a,
+ 0x1e5b00001e5c,
+ 0x1e5d00001e5e,
+ 0x1e5f00001e60,
+ 0x1e6100001e62,
+ 0x1e6300001e64,
+ 0x1e6500001e66,
+ 0x1e6700001e68,
+ 0x1e6900001e6a,
+ 0x1e6b00001e6c,
+ 0x1e6d00001e6e,
+ 0x1e6f00001e70,
+ 0x1e7100001e72,
+ 0x1e7300001e74,
+ 0x1e7500001e76,
+ 0x1e7700001e78,
+ 0x1e7900001e7a,
+ 0x1e7b00001e7c,
+ 0x1e7d00001e7e,
+ 0x1e7f00001e80,
+ 0x1e8100001e82,
+ 0x1e8300001e84,
+ 0x1e8500001e86,
+ 0x1e8700001e88,
+ 0x1e8900001e8a,
+ 0x1e8b00001e8c,
+ 0x1e8d00001e8e,
+ 0x1e8f00001e90,
+ 0x1e9100001e92,
+ 0x1e9300001e94,
+ 0x1e9500001e9a,
+ 0x1e9c00001e9e,
+ 0x1e9f00001ea0,
+ 0x1ea100001ea2,
+ 0x1ea300001ea4,
+ 0x1ea500001ea6,
+ 0x1ea700001ea8,
+ 0x1ea900001eaa,
+ 0x1eab00001eac,
+ 0x1ead00001eae,
+ 0x1eaf00001eb0,
+ 0x1eb100001eb2,
+ 0x1eb300001eb4,
+ 0x1eb500001eb6,
+ 0x1eb700001eb8,
+ 0x1eb900001eba,
+ 0x1ebb00001ebc,
+ 0x1ebd00001ebe,
+ 0x1ebf00001ec0,
+ 0x1ec100001ec2,
+ 0x1ec300001ec4,
+ 0x1ec500001ec6,
+ 0x1ec700001ec8,
+ 0x1ec900001eca,
+ 0x1ecb00001ecc,
+ 0x1ecd00001ece,
+ 0x1ecf00001ed0,
+ 0x1ed100001ed2,
+ 0x1ed300001ed4,
+ 0x1ed500001ed6,
+ 0x1ed700001ed8,
+ 0x1ed900001eda,
+ 0x1edb00001edc,
+ 0x1edd00001ede,
+ 0x1edf00001ee0,
+ 0x1ee100001ee2,
+ 0x1ee300001ee4,
+ 0x1ee500001ee6,
+ 0x1ee700001ee8,
+ 0x1ee900001eea,
+ 0x1eeb00001eec,
+ 0x1eed00001eee,
+ 0x1eef00001ef0,
+ 0x1ef100001ef2,
+ 0x1ef300001ef4,
+ 0x1ef500001ef6,
+ 0x1ef700001ef8,
+ 0x1ef900001efa,
+ 0x1efb00001efc,
+ 0x1efd00001efe,
+ 0x1eff00001f08,
+ 0x1f1000001f16,
+ 0x1f2000001f28,
+ 0x1f3000001f38,
+ 0x1f4000001f46,
+ 0x1f5000001f58,
+ 0x1f6000001f68,
+ 0x1f7000001f71,
+ 0x1f7200001f73,
+ 0x1f7400001f75,
+ 0x1f7600001f77,
+ 0x1f7800001f79,
+ 0x1f7a00001f7b,
+ 0x1f7c00001f7d,
+ 0x1fb000001fb2,
+ 0x1fb600001fb7,
+ 0x1fc600001fc7,
+ 0x1fd000001fd3,
+ 0x1fd600001fd8,
+ 0x1fe000001fe3,
+ 0x1fe400001fe8,
+ 0x1ff600001ff7,
+ 0x214e0000214f,
+ 0x218400002185,
+ 0x2c3000002c5f,
+ 0x2c6100002c62,
+ 0x2c6500002c67,
+ 0x2c6800002c69,
+ 0x2c6a00002c6b,
+ 0x2c6c00002c6d,
+ 0x2c7100002c72,
+ 0x2c7300002c75,
+ 0x2c7600002c7c,
+ 0x2c8100002c82,
+ 0x2c8300002c84,
+ 0x2c8500002c86,
+ 0x2c8700002c88,
+ 0x2c8900002c8a,
+ 0x2c8b00002c8c,
+ 0x2c8d00002c8e,
+ 0x2c8f00002c90,
+ 0x2c9100002c92,
+ 0x2c9300002c94,
+ 0x2c9500002c96,
+ 0x2c9700002c98,
+ 0x2c9900002c9a,
+ 0x2c9b00002c9c,
+ 0x2c9d00002c9e,
+ 0x2c9f00002ca0,
+ 0x2ca100002ca2,
+ 0x2ca300002ca4,
+ 0x2ca500002ca6,
+ 0x2ca700002ca8,
+ 0x2ca900002caa,
+ 0x2cab00002cac,
+ 0x2cad00002cae,
+ 0x2caf00002cb0,
+ 0x2cb100002cb2,
+ 0x2cb300002cb4,
+ 0x2cb500002cb6,
+ 0x2cb700002cb8,
+ 0x2cb900002cba,
+ 0x2cbb00002cbc,
+ 0x2cbd00002cbe,
+ 0x2cbf00002cc0,
+ 0x2cc100002cc2,
+ 0x2cc300002cc4,
+ 0x2cc500002cc6,
+ 0x2cc700002cc8,
+ 0x2cc900002cca,
+ 0x2ccb00002ccc,
+ 0x2ccd00002cce,
+ 0x2ccf00002cd0,
+ 0x2cd100002cd2,
+ 0x2cd300002cd4,
+ 0x2cd500002cd6,
+ 0x2cd700002cd8,
+ 0x2cd900002cda,
+ 0x2cdb00002cdc,
+ 0x2cdd00002cde,
+ 0x2cdf00002ce0,
+ 0x2ce100002ce2,
+ 0x2ce300002ce5,
+ 0x2cec00002ced,
+ 0x2cee00002cf2,
+ 0x2cf300002cf4,
+ 0x2d0000002d26,
+ 0x2d2700002d28,
+ 0x2d2d00002d2e,
+ 0x2d3000002d68,
+ 0x2d7f00002d97,
+ 0x2da000002da7,
+ 0x2da800002daf,
+ 0x2db000002db7,
+ 0x2db800002dbf,
+ 0x2dc000002dc7,
+ 0x2dc800002dcf,
+ 0x2dd000002dd7,
+ 0x2dd800002ddf,
+ 0x2de000002e00,
+ 0x2e2f00002e30,
+ 0x300500003008,
+ 0x302a0000302e,
+ 0x303c0000303d,
+ 0x304100003097,
+ 0x30990000309b,
+ 0x309d0000309f,
+ 0x30a1000030fb,
+ 0x30fc000030ff,
+ 0x310500003130,
+ 0x31a0000031c0,
+ 0x31f000003200,
+ 0x340000004dc0,
+ 0x4e0000009ffd,
+ 0xa0000000a48d,
+ 0xa4d00000a4fe,
+ 0xa5000000a60d,
+ 0xa6100000a62c,
+ 0xa6410000a642,
+ 0xa6430000a644,
+ 0xa6450000a646,
+ 0xa6470000a648,
+ 0xa6490000a64a,
+ 0xa64b0000a64c,
+ 0xa64d0000a64e,
+ 0xa64f0000a650,
+ 0xa6510000a652,
+ 0xa6530000a654,
+ 0xa6550000a656,
+ 0xa6570000a658,
+ 0xa6590000a65a,
+ 0xa65b0000a65c,
+ 0xa65d0000a65e,
+ 0xa65f0000a660,
+ 0xa6610000a662,
+ 0xa6630000a664,
+ 0xa6650000a666,
+ 0xa6670000a668,
+ 0xa6690000a66a,
+ 0xa66b0000a66c,
+ 0xa66d0000a670,
+ 0xa6740000a67e,
+ 0xa67f0000a680,
+ 0xa6810000a682,
+ 0xa6830000a684,
+ 0xa6850000a686,
+ 0xa6870000a688,
+ 0xa6890000a68a,
+ 0xa68b0000a68c,
+ 0xa68d0000a68e,
+ 0xa68f0000a690,
+ 0xa6910000a692,
+ 0xa6930000a694,
+ 0xa6950000a696,
+ 0xa6970000a698,
+ 0xa6990000a69a,
+ 0xa69b0000a69c,
+ 0xa69e0000a6e6,
+ 0xa6f00000a6f2,
+ 0xa7170000a720,
+ 0xa7230000a724,
+ 0xa7250000a726,
+ 0xa7270000a728,
+ 0xa7290000a72a,
+ 0xa72b0000a72c,
+ 0xa72d0000a72e,
+ 0xa72f0000a732,
+ 0xa7330000a734,
+ 0xa7350000a736,
+ 0xa7370000a738,
+ 0xa7390000a73a,
+ 0xa73b0000a73c,
+ 0xa73d0000a73e,
+ 0xa73f0000a740,
+ 0xa7410000a742,
+ 0xa7430000a744,
+ 0xa7450000a746,
+ 0xa7470000a748,
+ 0xa7490000a74a,
+ 0xa74b0000a74c,
+ 0xa74d0000a74e,
+ 0xa74f0000a750,
+ 0xa7510000a752,
+ 0xa7530000a754,
+ 0xa7550000a756,
+ 0xa7570000a758,
+ 0xa7590000a75a,
+ 0xa75b0000a75c,
+ 0xa75d0000a75e,
+ 0xa75f0000a760,
+ 0xa7610000a762,
+ 0xa7630000a764,
+ 0xa7650000a766,
+ 0xa7670000a768,
+ 0xa7690000a76a,
+ 0xa76b0000a76c,
+ 0xa76d0000a76e,
+ 0xa76f0000a770,
+ 0xa7710000a779,
+ 0xa77a0000a77b,
+ 0xa77c0000a77d,
+ 0xa77f0000a780,
+ 0xa7810000a782,
+ 0xa7830000a784,
+ 0xa7850000a786,
+ 0xa7870000a789,
+ 0xa78c0000a78d,
+ 0xa78e0000a790,
+ 0xa7910000a792,
+ 0xa7930000a796,
+ 0xa7970000a798,
+ 0xa7990000a79a,
+ 0xa79b0000a79c,
+ 0xa79d0000a79e,
+ 0xa79f0000a7a0,
+ 0xa7a10000a7a2,
+ 0xa7a30000a7a4,
+ 0xa7a50000a7a6,
+ 0xa7a70000a7a8,
+ 0xa7a90000a7aa,
+ 0xa7af0000a7b0,
+ 0xa7b50000a7b6,
+ 0xa7b70000a7b8,
+ 0xa7b90000a7ba,
+ 0xa7bb0000a7bc,
+ 0xa7bd0000a7be,
+ 0xa7bf0000a7c0,
+ 0xa7c30000a7c4,
+ 0xa7c80000a7c9,
+ 0xa7ca0000a7cb,
+ 0xa7f60000a7f8,
+ 0xa7fa0000a828,
+ 0xa82c0000a82d,
+ 0xa8400000a874,
+ 0xa8800000a8c6,
+ 0xa8d00000a8da,
+ 0xa8e00000a8f8,
+ 0xa8fb0000a8fc,
+ 0xa8fd0000a92e,
+ 0xa9300000a954,
+ 0xa9800000a9c1,
+ 0xa9cf0000a9da,
+ 0xa9e00000a9ff,
+ 0xaa000000aa37,
+ 0xaa400000aa4e,
+ 0xaa500000aa5a,
+ 0xaa600000aa77,
+ 0xaa7a0000aac3,
+ 0xaadb0000aade,
+ 0xaae00000aaf0,
+ 0xaaf20000aaf7,
+ 0xab010000ab07,
+ 0xab090000ab0f,
+ 0xab110000ab17,
+ 0xab200000ab27,
+ 0xab280000ab2f,
+ 0xab300000ab5b,
+ 0xab600000ab6a,
+ 0xabc00000abeb,
+ 0xabec0000abee,
+ 0xabf00000abfa,
+ 0xac000000d7a4,
+ 0xfa0e0000fa10,
+ 0xfa110000fa12,
+ 0xfa130000fa15,
+ 0xfa1f0000fa20,
+ 0xfa210000fa22,
+ 0xfa230000fa25,
+ 0xfa270000fa2a,
+ 0xfb1e0000fb1f,
+ 0xfe200000fe30,
+ 0xfe730000fe74,
+ 0x100000001000c,
+ 0x1000d00010027,
+ 0x100280001003b,
+ 0x1003c0001003e,
+ 0x1003f0001004e,
+ 0x100500001005e,
+ 0x10080000100fb,
+ 0x101fd000101fe,
+ 0x102800001029d,
+ 0x102a0000102d1,
+ 0x102e0000102e1,
+ 0x1030000010320,
+ 0x1032d00010341,
+ 0x103420001034a,
+ 0x103500001037b,
+ 0x103800001039e,
+ 0x103a0000103c4,
+ 0x103c8000103d0,
+ 0x104280001049e,
+ 0x104a0000104aa,
+ 0x104d8000104fc,
+ 0x1050000010528,
+ 0x1053000010564,
+ 0x1060000010737,
+ 0x1074000010756,
+ 0x1076000010768,
+ 0x1080000010806,
+ 0x1080800010809,
+ 0x1080a00010836,
+ 0x1083700010839,
+ 0x1083c0001083d,
+ 0x1083f00010856,
+ 0x1086000010877,
+ 0x108800001089f,
+ 0x108e0000108f3,
+ 0x108f4000108f6,
+ 0x1090000010916,
+ 0x109200001093a,
+ 0x10980000109b8,
+ 0x109be000109c0,
+ 0x10a0000010a04,
+ 0x10a0500010a07,
+ 0x10a0c00010a14,
+ 0x10a1500010a18,
+ 0x10a1900010a36,
+ 0x10a3800010a3b,
+ 0x10a3f00010a40,
+ 0x10a6000010a7d,
+ 0x10a8000010a9d,
+ 0x10ac000010ac8,
+ 0x10ac900010ae7,
+ 0x10b0000010b36,
+ 0x10b4000010b56,
+ 0x10b6000010b73,
+ 0x10b8000010b92,
+ 0x10c0000010c49,
+ 0x10cc000010cf3,
+ 0x10d0000010d28,
+ 0x10d3000010d3a,
+ 0x10e8000010eaa,
+ 0x10eab00010ead,
+ 0x10eb000010eb2,
+ 0x10f0000010f1d,
+ 0x10f2700010f28,
+ 0x10f3000010f51,
+ 0x10fb000010fc5,
+ 0x10fe000010ff7,
+ 0x1100000011047,
+ 0x1106600011070,
+ 0x1107f000110bb,
+ 0x110d0000110e9,
+ 0x110f0000110fa,
+ 0x1110000011135,
+ 0x1113600011140,
+ 0x1114400011148,
+ 0x1115000011174,
+ 0x1117600011177,
+ 0x11180000111c5,
+ 0x111c9000111cd,
+ 0x111ce000111db,
+ 0x111dc000111dd,
+ 0x1120000011212,
+ 0x1121300011238,
+ 0x1123e0001123f,
+ 0x1128000011287,
+ 0x1128800011289,
+ 0x1128a0001128e,
+ 0x1128f0001129e,
+ 0x1129f000112a9,
+ 0x112b0000112eb,
+ 0x112f0000112fa,
+ 0x1130000011304,
+ 0x113050001130d,
+ 0x1130f00011311,
+ 0x1131300011329,
+ 0x1132a00011331,
+ 0x1133200011334,
+ 0x113350001133a,
+ 0x1133b00011345,
+ 0x1134700011349,
+ 0x1134b0001134e,
+ 0x1135000011351,
+ 0x1135700011358,
+ 0x1135d00011364,
+ 0x113660001136d,
+ 0x1137000011375,
+ 0x114000001144b,
+ 0x114500001145a,
+ 0x1145e00011462,
+ 0x11480000114c6,
+ 0x114c7000114c8,
+ 0x114d0000114da,
+ 0x11580000115b6,
+ 0x115b8000115c1,
+ 0x115d8000115de,
+ 0x1160000011641,
+ 0x1164400011645,
+ 0x116500001165a,
+ 0x11680000116b9,
+ 0x116c0000116ca,
+ 0x117000001171b,
+ 0x1171d0001172c,
+ 0x117300001173a,
+ 0x118000001183b,
+ 0x118c0000118ea,
+ 0x118ff00011907,
+ 0x119090001190a,
+ 0x1190c00011914,
+ 0x1191500011917,
+ 0x1191800011936,
+ 0x1193700011939,
+ 0x1193b00011944,
+ 0x119500001195a,
+ 0x119a0000119a8,
+ 0x119aa000119d8,
+ 0x119da000119e2,
+ 0x119e3000119e5,
+ 0x11a0000011a3f,
+ 0x11a4700011a48,
+ 0x11a5000011a9a,
+ 0x11a9d00011a9e,
+ 0x11ac000011af9,
+ 0x11c0000011c09,
+ 0x11c0a00011c37,
+ 0x11c3800011c41,
+ 0x11c5000011c5a,
+ 0x11c7200011c90,
+ 0x11c9200011ca8,
+ 0x11ca900011cb7,
+ 0x11d0000011d07,
+ 0x11d0800011d0a,
+ 0x11d0b00011d37,
+ 0x11d3a00011d3b,
+ 0x11d3c00011d3e,
+ 0x11d3f00011d48,
+ 0x11d5000011d5a,
+ 0x11d6000011d66,
+ 0x11d6700011d69,
+ 0x11d6a00011d8f,
+ 0x11d9000011d92,
+ 0x11d9300011d99,
+ 0x11da000011daa,
+ 0x11ee000011ef7,
+ 0x11fb000011fb1,
+ 0x120000001239a,
+ 0x1248000012544,
+ 0x130000001342f,
+ 0x1440000014647,
+ 0x1680000016a39,
+ 0x16a4000016a5f,
+ 0x16a6000016a6a,
+ 0x16ad000016aee,
+ 0x16af000016af5,
+ 0x16b0000016b37,
+ 0x16b4000016b44,
+ 0x16b5000016b5a,
+ 0x16b6300016b78,
+ 0x16b7d00016b90,
+ 0x16e6000016e80,
+ 0x16f0000016f4b,
+ 0x16f4f00016f88,
+ 0x16f8f00016fa0,
+ 0x16fe000016fe2,
+ 0x16fe300016fe5,
+ 0x16ff000016ff2,
+ 0x17000000187f8,
+ 0x1880000018cd6,
+ 0x18d0000018d09,
+ 0x1b0000001b11f,
+ 0x1b1500001b153,
+ 0x1b1640001b168,
+ 0x1b1700001b2fc,
+ 0x1bc000001bc6b,
+ 0x1bc700001bc7d,
+ 0x1bc800001bc89,
+ 0x1bc900001bc9a,
+ 0x1bc9d0001bc9f,
+ 0x1da000001da37,
+ 0x1da3b0001da6d,
+ 0x1da750001da76,
+ 0x1da840001da85,
+ 0x1da9b0001daa0,
+ 0x1daa10001dab0,
+ 0x1e0000001e007,
+ 0x1e0080001e019,
+ 0x1e01b0001e022,
+ 0x1e0230001e025,
+ 0x1e0260001e02b,
+ 0x1e1000001e12d,
+ 0x1e1300001e13e,
+ 0x1e1400001e14a,
+ 0x1e14e0001e14f,
+ 0x1e2c00001e2fa,
+ 0x1e8000001e8c5,
+ 0x1e8d00001e8d7,
+ 0x1e9220001e94c,
+ 0x1e9500001e95a,
+ 0x1fbf00001fbfa,
+ 0x200000002a6de,
+ 0x2a7000002b735,
+ 0x2b7400002b81e,
+ 0x2b8200002cea2,
+ 0x2ceb00002ebe1,
+ 0x300000003134b,
+ ),
+ 'CONTEXTJ': (
+ 0x200c0000200e,
+ ),
+ 'CONTEXTO': (
+ 0xb7000000b8,
+ 0x37500000376,
+ 0x5f3000005f5,
+ 0x6600000066a,
+ 0x6f0000006fa,
+ 0x30fb000030fc,
+ ),
+}
diff --git a/.venv/lib/python3.9/site-packages/idna/intranges.py b/.venv/lib/python3.9/site-packages/idna/intranges.py
new file mode 100644
index 0000000..fa8a735
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna/intranges.py
@@ -0,0 +1,53 @@
+"""
+Given a list of integers, made up of (hopefully) a small number of long runs
+of consecutive integers, compute a representation of the form
+((start1, end1), (start2, end2) ...). Then answer the question "was x present
+in the original list?" in time O(log(# runs)).
+"""
+
+import bisect
+
+def intranges_from_list(list_):
+ """Represent a list of integers as a sequence of ranges:
+ ((start_0, end_0), (start_1, end_1), ...), such that the original
+ integers are exactly those x such that start_i <= x < end_i for some i.
+
+ Ranges are encoded as single integers (start << 32 | end), not as tuples.
+ """
+
+ sorted_list = sorted(list_)
+ ranges = []
+ last_write = -1
+ for i in range(len(sorted_list)):
+ if i+1 < len(sorted_list):
+ if sorted_list[i] == sorted_list[i+1]-1:
+ continue
+ current_range = sorted_list[last_write+1:i+1]
+ ranges.append(_encode_range(current_range[0], current_range[-1] + 1))
+ last_write = i
+
+ return tuple(ranges)
+
+def _encode_range(start, end):
+ return (start << 32) | end
+
+def _decode_range(r):
+ return (r >> 32), (r & ((1 << 32) - 1))
+
+
+def intranges_contain(int_, ranges):
+ """Determine if `int_` falls into one of the ranges in `ranges`."""
+ tuple_ = _encode_range(int_, 0)  # an encoded int, despite the name
+ pos = bisect.bisect_left(ranges, tuple_)
+ # int_ may fall inside the encoded range just before pos:
+ # (start, end) with start <= int_ < end
+ if pos > 0:
+ left, right = _decode_range(ranges[pos-1])
+ if left <= int_ < right:
+ return True
+ # or the range at pos may start exactly at int_
+ if pos < len(ranges):
+ left, _ = _decode_range(ranges[pos])
+ if left == int_:
+ return True
+ return False
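The helper above packs each run of consecutive integers into a single `(start << 32) | end` integer so the whole table can be binary-searched. A standalone sketch of the same scheme (re-implemented here rather than imported, since the module lives inside a vendored `.venv` path):

```python
import bisect

# Standalone sketch of the idna.intranges scheme: runs of consecutive
# integers are packed as (start << 32) | end, half-open [start, end).

def intranges_from_list(list_):
    """Collapse a list of integers into encoded runs."""
    sorted_list = sorted(list_)
    ranges = []
    last_write = -1
    for i in range(len(sorted_list)):
        # keep scanning while the next value continues the current run
        if i + 1 < len(sorted_list) and sorted_list[i] == sorted_list[i + 1] - 1:
            continue
        run = sorted_list[last_write + 1:i + 1]
        ranges.append((run[0] << 32) | (run[-1] + 1))
        last_write = i
    return tuple(ranges)

def intranges_contain(int_, ranges):
    """Binary-search the encoded runs for int_."""
    probe = int_ << 32  # sorts at or just before any run starting at int_
    pos = bisect.bisect_left(ranges, probe)
    if pos > 0:  # int_ may fall inside the run just before the probe
        start, end = ranges[pos - 1] >> 32, ranges[pos - 1] & 0xFFFFFFFF
        if start <= int_ < end:
            return True
    # or the run at pos may begin exactly at int_
    return pos < len(ranges) and ranges[pos] >> 32 == int_

runs = intranges_from_list([1, 2, 3, 7, 8, 15])
# runs == ((1 << 32) | 4, (7 << 32) | 9, (15 << 32) | 16)
```

The encoding works because every real run has `end > 0`, so the probe `int_ << 32` can never collide with an encoded run; membership tests then cost O(log n) in the number of runs, which is how the large `codepoint_classes` ranges above stay cheap to query.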
diff --git a/.venv/lib/python3.9/site-packages/idna/package_data.py b/.venv/lib/python3.9/site-packages/idna/package_data.py
new file mode 100644
index 0000000..ce1c521
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna/package_data.py
@@ -0,0 +1,2 @@
+__version__ = '2.10'
+
diff --git a/.venv/lib/python3.9/site-packages/idna/uts46data.py b/.venv/lib/python3.9/site-packages/idna/uts46data.py
new file mode 100644
index 0000000..3766dd4
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/idna/uts46data.py
@@ -0,0 +1,8357 @@
+# This file is automatically generated by tools/idna-data
+# vim: set fileencoding=utf-8 :
+
+"""IDNA Mapping Table from UTS46."""
+
+
+__version__ = "13.0.0"
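The `_seg_*` lists that follow are consumed by binary search: each `(codepoint, status, ...)` row governs every codepoint up to the next row's start. A toy sketch of such a lookup (the status letters are an assumption drawn from UTS #46 — 'V' valid, 'M' mapped to the given replacement, 'X' disallowed, 'I' ignored, 'D' deviation, '3' disallowed under STD3 rules but otherwise valid — and the excerpt rows are illustrative, not the real table):

```python
import bisect

# Toy excerpt of a UTS46-style table; each row governs codepoints from its
# start up to (but not including) the next row's start. Status letters
# follow UTS #46 (assumption): V=valid, M=mapped, X=disallowed, I=ignored,
# D=deviation, 3=disallowed_STD3_valid.
TABLE = [
    (0x41, 'M', 'a'),  # 'A' -> 'a' (the full table has one row per letter)
    (0x5B, '3'),       # '[' etc.: disallowed under STD3, else valid
    (0x61, 'V'),       # 'a'..'z' valid
    (0x7B, '3'),
    (0x80, 'X'),       # C1 controls disallowed
]

def lookup(cp):
    """Return the last row whose start codepoint is <= cp."""
    # probe sorts after any row starting at cp, since statuses are ASCII
    idx = bisect.bisect_right(TABLE, (cp, '\uffff')) - 1
    return TABLE[idx]
```

The real lookup in `idna` works along these lines, bisecting the concatenated `_seg_*` lists instead of this hand-made excerpt.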
+def _seg_0():
+ return [
+ (0x0, '3'),
+ (0x1, '3'),
+ (0x2, '3'),
+ (0x3, '3'),
+ (0x4, '3'),
+ (0x5, '3'),
+ (0x6, '3'),
+ (0x7, '3'),
+ (0x8, '3'),
+ (0x9, '3'),
+ (0xA, '3'),
+ (0xB, '3'),
+ (0xC, '3'),
+ (0xD, '3'),
+ (0xE, '3'),
+ (0xF, '3'),
+ (0x10, '3'),
+ (0x11, '3'),
+ (0x12, '3'),
+ (0x13, '3'),
+ (0x14, '3'),
+ (0x15, '3'),
+ (0x16, '3'),
+ (0x17, '3'),
+ (0x18, '3'),
+ (0x19, '3'),
+ (0x1A, '3'),
+ (0x1B, '3'),
+ (0x1C, '3'),
+ (0x1D, '3'),
+ (0x1E, '3'),
+ (0x1F, '3'),
+ (0x20, '3'),
+ (0x21, '3'),
+ (0x22, '3'),
+ (0x23, '3'),
+ (0x24, '3'),
+ (0x25, '3'),
+ (0x26, '3'),
+ (0x27, '3'),
+ (0x28, '3'),
+ (0x29, '3'),
+ (0x2A, '3'),
+ (0x2B, '3'),
+ (0x2C, '3'),
+ (0x2D, 'V'),
+ (0x2E, 'V'),
+ (0x2F, '3'),
+ (0x30, 'V'),
+ (0x31, 'V'),
+ (0x32, 'V'),
+ (0x33, 'V'),
+ (0x34, 'V'),
+ (0x35, 'V'),
+ (0x36, 'V'),
+ (0x37, 'V'),
+ (0x38, 'V'),
+ (0x39, 'V'),
+ (0x3A, '3'),
+ (0x3B, '3'),
+ (0x3C, '3'),
+ (0x3D, '3'),
+ (0x3E, '3'),
+ (0x3F, '3'),
+ (0x40, '3'),
+ (0x41, 'M', u'a'),
+ (0x42, 'M', u'b'),
+ (0x43, 'M', u'c'),
+ (0x44, 'M', u'd'),
+ (0x45, 'M', u'e'),
+ (0x46, 'M', u'f'),
+ (0x47, 'M', u'g'),
+ (0x48, 'M', u'h'),
+ (0x49, 'M', u'i'),
+ (0x4A, 'M', u'j'),
+ (0x4B, 'M', u'k'),
+ (0x4C, 'M', u'l'),
+ (0x4D, 'M', u'm'),
+ (0x4E, 'M', u'n'),
+ (0x4F, 'M', u'o'),
+ (0x50, 'M', u'p'),
+ (0x51, 'M', u'q'),
+ (0x52, 'M', u'r'),
+ (0x53, 'M', u's'),
+ (0x54, 'M', u't'),
+ (0x55, 'M', u'u'),
+ (0x56, 'M', u'v'),
+ (0x57, 'M', u'w'),
+ (0x58, 'M', u'x'),
+ (0x59, 'M', u'y'),
+ (0x5A, 'M', u'z'),
+ (0x5B, '3'),
+ (0x5C, '3'),
+ (0x5D, '3'),
+ (0x5E, '3'),
+ (0x5F, '3'),
+ (0x60, '3'),
+ (0x61, 'V'),
+ (0x62, 'V'),
+ (0x63, 'V'),
+ ]
+
+def _seg_1():
+ return [
+ (0x64, 'V'),
+ (0x65, 'V'),
+ (0x66, 'V'),
+ (0x67, 'V'),
+ (0x68, 'V'),
+ (0x69, 'V'),
+ (0x6A, 'V'),
+ (0x6B, 'V'),
+ (0x6C, 'V'),
+ (0x6D, 'V'),
+ (0x6E, 'V'),
+ (0x6F, 'V'),
+ (0x70, 'V'),
+ (0x71, 'V'),
+ (0x72, 'V'),
+ (0x73, 'V'),
+ (0x74, 'V'),
+ (0x75, 'V'),
+ (0x76, 'V'),
+ (0x77, 'V'),
+ (0x78, 'V'),
+ (0x79, 'V'),
+ (0x7A, 'V'),
+ (0x7B, '3'),
+ (0x7C, '3'),
+ (0x7D, '3'),
+ (0x7E, '3'),
+ (0x7F, '3'),
+ (0x80, 'X'),
+ (0x81, 'X'),
+ (0x82, 'X'),
+ (0x83, 'X'),
+ (0x84, 'X'),
+ (0x85, 'X'),
+ (0x86, 'X'),
+ (0x87, 'X'),
+ (0x88, 'X'),
+ (0x89, 'X'),
+ (0x8A, 'X'),
+ (0x8B, 'X'),
+ (0x8C, 'X'),
+ (0x8D, 'X'),
+ (0x8E, 'X'),
+ (0x8F, 'X'),
+ (0x90, 'X'),
+ (0x91, 'X'),
+ (0x92, 'X'),
+ (0x93, 'X'),
+ (0x94, 'X'),
+ (0x95, 'X'),
+ (0x96, 'X'),
+ (0x97, 'X'),
+ (0x98, 'X'),
+ (0x99, 'X'),
+ (0x9A, 'X'),
+ (0x9B, 'X'),
+ (0x9C, 'X'),
+ (0x9D, 'X'),
+ (0x9E, 'X'),
+ (0x9F, 'X'),
+ (0xA0, '3', u' '),
+ (0xA1, 'V'),
+ (0xA2, 'V'),
+ (0xA3, 'V'),
+ (0xA4, 'V'),
+ (0xA5, 'V'),
+ (0xA6, 'V'),
+ (0xA7, 'V'),
+ (0xA8, '3', u' ̈'),
+ (0xA9, 'V'),
+ (0xAA, 'M', u'a'),
+ (0xAB, 'V'),
+ (0xAC, 'V'),
+ (0xAD, 'I'),
+ (0xAE, 'V'),
+ (0xAF, '3', u' ̄'),
+ (0xB0, 'V'),
+ (0xB1, 'V'),
+ (0xB2, 'M', u'2'),
+ (0xB3, 'M', u'3'),
+ (0xB4, '3', u' ́'),
+ (0xB5, 'M', u'μ'),
+ (0xB6, 'V'),
+ (0xB7, 'V'),
+ (0xB8, '3', u' ̧'),
+ (0xB9, 'M', u'1'),
+ (0xBA, 'M', u'o'),
+ (0xBB, 'V'),
+ (0xBC, 'M', u'1⁄4'),
+ (0xBD, 'M', u'1⁄2'),
+ (0xBE, 'M', u'3⁄4'),
+ (0xBF, 'V'),
+ (0xC0, 'M', u'à'),
+ (0xC1, 'M', u'á'),
+ (0xC2, 'M', u'â'),
+ (0xC3, 'M', u'ã'),
+ (0xC4, 'M', u'ä'),
+ (0xC5, 'M', u'å'),
+ (0xC6, 'M', u'æ'),
+ (0xC7, 'M', u'ç'),
+ ]
+
+def _seg_2():
+ return [
+ (0xC8, 'M', u'è'),
+ (0xC9, 'M', u'é'),
+ (0xCA, 'M', u'ê'),
+ (0xCB, 'M', u'ë'),
+ (0xCC, 'M', u'ì'),
+ (0xCD, 'M', u'í'),
+ (0xCE, 'M', u'î'),
+ (0xCF, 'M', u'ï'),
+ (0xD0, 'M', u'ð'),
+ (0xD1, 'M', u'ñ'),
+ (0xD2, 'M', u'ò'),
+ (0xD3, 'M', u'ó'),
+ (0xD4, 'M', u'ô'),
+ (0xD5, 'M', u'õ'),
+ (0xD6, 'M', u'ö'),
+ (0xD7, 'V'),
+ (0xD8, 'M', u'ø'),
+ (0xD9, 'M', u'ù'),
+ (0xDA, 'M', u'ú'),
+ (0xDB, 'M', u'û'),
+ (0xDC, 'M', u'ü'),
+ (0xDD, 'M', u'ý'),
+ (0xDE, 'M', u'þ'),
+ (0xDF, 'D', u'ss'),
+ (0xE0, 'V'),
+ (0xE1, 'V'),
+ (0xE2, 'V'),
+ (0xE3, 'V'),
+ (0xE4, 'V'),
+ (0xE5, 'V'),
+ (0xE6, 'V'),
+ (0xE7, 'V'),
+ (0xE8, 'V'),
+ (0xE9, 'V'),
+ (0xEA, 'V'),
+ (0xEB, 'V'),
+ (0xEC, 'V'),
+ (0xED, 'V'),
+ (0xEE, 'V'),
+ (0xEF, 'V'),
+ (0xF0, 'V'),
+ (0xF1, 'V'),
+ (0xF2, 'V'),
+ (0xF3, 'V'),
+ (0xF4, 'V'),
+ (0xF5, 'V'),
+ (0xF6, 'V'),
+ (0xF7, 'V'),
+ (0xF8, 'V'),
+ (0xF9, 'V'),
+ (0xFA, 'V'),
+ (0xFB, 'V'),
+ (0xFC, 'V'),
+ (0xFD, 'V'),
+ (0xFE, 'V'),
+ (0xFF, 'V'),
+ (0x100, 'M', u'ā'),
+ (0x101, 'V'),
+ (0x102, 'M', u'ă'),
+ (0x103, 'V'),
+ (0x104, 'M', u'ą'),
+ (0x105, 'V'),
+ (0x106, 'M', u'ć'),
+ (0x107, 'V'),
+ (0x108, 'M', u'ĉ'),
+ (0x109, 'V'),
+ (0x10A, 'M', u'ċ'),
+ (0x10B, 'V'),
+ (0x10C, 'M', u'č'),
+ (0x10D, 'V'),
+ (0x10E, 'M', u'ď'),
+ (0x10F, 'V'),
+ (0x110, 'M', u'đ'),
+ (0x111, 'V'),
+ (0x112, 'M', u'ē'),
+ (0x113, 'V'),
+ (0x114, 'M', u'ĕ'),
+ (0x115, 'V'),
+ (0x116, 'M', u'ė'),
+ (0x117, 'V'),
+ (0x118, 'M', u'ę'),
+ (0x119, 'V'),
+ (0x11A, 'M', u'ě'),
+ (0x11B, 'V'),
+ (0x11C, 'M', u'ĝ'),
+ (0x11D, 'V'),
+ (0x11E, 'M', u'ğ'),
+ (0x11F, 'V'),
+ (0x120, 'M', u'ġ'),
+ (0x121, 'V'),
+ (0x122, 'M', u'ģ'),
+ (0x123, 'V'),
+ (0x124, 'M', u'ĥ'),
+ (0x125, 'V'),
+ (0x126, 'M', u'ħ'),
+ (0x127, 'V'),
+ (0x128, 'M', u'ĩ'),
+ (0x129, 'V'),
+ (0x12A, 'M', u'ī'),
+ (0x12B, 'V'),
+ ]
+
+def _seg_3():
+ return [
+ (0x12C, 'M', u'ĭ'),
+ (0x12D, 'V'),
+ (0x12E, 'M', u'į'),
+ (0x12F, 'V'),
+ (0x130, 'M', u'i̇'),
+ (0x131, 'V'),
+ (0x132, 'M', u'ij'),
+ (0x134, 'M', u'ĵ'),
+ (0x135, 'V'),
+ (0x136, 'M', u'ķ'),
+ (0x137, 'V'),
+ (0x139, 'M', u'ĺ'),
+ (0x13A, 'V'),
+ (0x13B, 'M', u'ļ'),
+ (0x13C, 'V'),
+ (0x13D, 'M', u'ľ'),
+ (0x13E, 'V'),
+ (0x13F, 'M', u'l·'),
+ (0x141, 'M', u'ł'),
+ (0x142, 'V'),
+ (0x143, 'M', u'ń'),
+ (0x144, 'V'),
+ (0x145, 'M', u'ņ'),
+ (0x146, 'V'),
+ (0x147, 'M', u'ň'),
+ (0x148, 'V'),
+ (0x149, 'M', u'ʼn'),
+ (0x14A, 'M', u'ŋ'),
+ (0x14B, 'V'),
+ (0x14C, 'M', u'ō'),
+ (0x14D, 'V'),
+ (0x14E, 'M', u'ŏ'),
+ (0x14F, 'V'),
+ (0x150, 'M', u'ő'),
+ (0x151, 'V'),
+ (0x152, 'M', u'œ'),
+ (0x153, 'V'),
+ (0x154, 'M', u'ŕ'),
+ (0x155, 'V'),
+ (0x156, 'M', u'ŗ'),
+ (0x157, 'V'),
+ (0x158, 'M', u'ř'),
+ (0x159, 'V'),
+ (0x15A, 'M', u'ś'),
+ (0x15B, 'V'),
+ (0x15C, 'M', u'ŝ'),
+ (0x15D, 'V'),
+ (0x15E, 'M', u'ş'),
+ (0x15F, 'V'),
+ (0x160, 'M', u'š'),
+ (0x161, 'V'),
+ (0x162, 'M', u'ţ'),
+ (0x163, 'V'),
+ (0x164, 'M', u'ť'),
+ (0x165, 'V'),
+ (0x166, 'M', u'ŧ'),
+ (0x167, 'V'),
+ (0x168, 'M', u'ũ'),
+ (0x169, 'V'),
+ (0x16A, 'M', u'ū'),
+ (0x16B, 'V'),
+ (0x16C, 'M', u'ŭ'),
+ (0x16D, 'V'),
+ (0x16E, 'M', u'ů'),
+ (0x16F, 'V'),
+ (0x170, 'M', u'ű'),
+ (0x171, 'V'),
+ (0x172, 'M', u'ų'),
+ (0x173, 'V'),
+ (0x174, 'M', u'ŵ'),
+ (0x175, 'V'),
+ (0x176, 'M', u'ŷ'),
+ (0x177, 'V'),
+ (0x178, 'M', u'ÿ'),
+ (0x179, 'M', u'ź'),
+ (0x17A, 'V'),
+ (0x17B, 'M', u'ż'),
+ (0x17C, 'V'),
+ (0x17D, 'M', u'ž'),
+ (0x17E, 'V'),
+ (0x17F, 'M', u's'),
+ (0x180, 'V'),
+ (0x181, 'M', u'ɓ'),
+ (0x182, 'M', u'ƃ'),
+ (0x183, 'V'),
+ (0x184, 'M', u'ƅ'),
+ (0x185, 'V'),
+ (0x186, 'M', u'ɔ'),
+ (0x187, 'M', u'ƈ'),
+ (0x188, 'V'),
+ (0x189, 'M', u'ɖ'),
+ (0x18A, 'M', u'ɗ'),
+ (0x18B, 'M', u'ƌ'),
+ (0x18C, 'V'),
+ (0x18E, 'M', u'ǝ'),
+ (0x18F, 'M', u'ə'),
+ (0x190, 'M', u'ɛ'),
+ (0x191, 'M', u'ƒ'),
+ (0x192, 'V'),
+ (0x193, 'M', u'ɠ'),
+ ]
+
+def _seg_4():
+ return [
+ (0x194, 'M', u'ɣ'),
+ (0x195, 'V'),
+ (0x196, 'M', u'ɩ'),
+ (0x197, 'M', u'ɨ'),
+ (0x198, 'M', u'ƙ'),
+ (0x199, 'V'),
+ (0x19C, 'M', u'ɯ'),
+ (0x19D, 'M', u'ɲ'),
+ (0x19E, 'V'),
+ (0x19F, 'M', u'ɵ'),
+ (0x1A0, 'M', u'ơ'),
+ (0x1A1, 'V'),
+ (0x1A2, 'M', u'ƣ'),
+ (0x1A3, 'V'),
+ (0x1A4, 'M', u'ƥ'),
+ (0x1A5, 'V'),
+ (0x1A6, 'M', u'ʀ'),
+ (0x1A7, 'M', u'ƨ'),
+ (0x1A8, 'V'),
+ (0x1A9, 'M', u'ʃ'),
+ (0x1AA, 'V'),
+ (0x1AC, 'M', u'ƭ'),
+ (0x1AD, 'V'),
+ (0x1AE, 'M', u'ʈ'),
+ (0x1AF, 'M', u'ư'),
+ (0x1B0, 'V'),
+ (0x1B1, 'M', u'ʊ'),
+ (0x1B2, 'M', u'ʋ'),
+ (0x1B3, 'M', u'ƴ'),
+ (0x1B4, 'V'),
+ (0x1B5, 'M', u'ƶ'),
+ (0x1B6, 'V'),
+ (0x1B7, 'M', u'ʒ'),
+ (0x1B8, 'M', u'ƹ'),
+ (0x1B9, 'V'),
+ (0x1BC, 'M', u'ƽ'),
+ (0x1BD, 'V'),
+ (0x1C4, 'M', u'dž'),
+ (0x1C7, 'M', u'lj'),
+ (0x1CA, 'M', u'nj'),
+ (0x1CD, 'M', u'ǎ'),
+ (0x1CE, 'V'),
+ (0x1CF, 'M', u'ǐ'),
+ (0x1D0, 'V'),
+ (0x1D1, 'M', u'ǒ'),
+ (0x1D2, 'V'),
+ (0x1D3, 'M', u'ǔ'),
+ (0x1D4, 'V'),
+ (0x1D5, 'M', u'ǖ'),
+ (0x1D6, 'V'),
+ (0x1D7, 'M', u'ǘ'),
+ (0x1D8, 'V'),
+ (0x1D9, 'M', u'ǚ'),
+ (0x1DA, 'V'),
+ (0x1DB, 'M', u'ǜ'),
+ (0x1DC, 'V'),
+ (0x1DE, 'M', u'ǟ'),
+ (0x1DF, 'V'),
+ (0x1E0, 'M', u'ǡ'),
+ (0x1E1, 'V'),
+ (0x1E2, 'M', u'ǣ'),
+ (0x1E3, 'V'),
+ (0x1E4, 'M', u'ǥ'),
+ (0x1E5, 'V'),
+ (0x1E6, 'M', u'ǧ'),
+ (0x1E7, 'V'),
+ (0x1E8, 'M', u'ǩ'),
+ (0x1E9, 'V'),
+ (0x1EA, 'M', u'ǫ'),
+ (0x1EB, 'V'),
+ (0x1EC, 'M', u'ǭ'),
+ (0x1ED, 'V'),
+ (0x1EE, 'M', u'ǯ'),
+ (0x1EF, 'V'),
+ (0x1F1, 'M', u'dz'),
+ (0x1F4, 'M', u'ǵ'),
+ (0x1F5, 'V'),
+ (0x1F6, 'M', u'ƕ'),
+ (0x1F7, 'M', u'ƿ'),
+ (0x1F8, 'M', u'ǹ'),
+ (0x1F9, 'V'),
+ (0x1FA, 'M', u'ǻ'),
+ (0x1FB, 'V'),
+ (0x1FC, 'M', u'ǽ'),
+ (0x1FD, 'V'),
+ (0x1FE, 'M', u'ǿ'),
+ (0x1FF, 'V'),
+ (0x200, 'M', u'ȁ'),
+ (0x201, 'V'),
+ (0x202, 'M', u'ȃ'),
+ (0x203, 'V'),
+ (0x204, 'M', u'ȅ'),
+ (0x205, 'V'),
+ (0x206, 'M', u'ȇ'),
+ (0x207, 'V'),
+ (0x208, 'M', u'ȉ'),
+ (0x209, 'V'),
+ (0x20A, 'M', u'ȋ'),
+ (0x20B, 'V'),
+ (0x20C, 'M', u'ȍ'),
+ ]
+
+def _seg_5():
+ return [
+ (0x20D, 'V'),
+ (0x20E, 'M', u'ȏ'),
+ (0x20F, 'V'),
+ (0x210, 'M', u'ȑ'),
+ (0x211, 'V'),
+ (0x212, 'M', u'ȓ'),
+ (0x213, 'V'),
+ (0x214, 'M', u'ȕ'),
+ (0x215, 'V'),
+ (0x216, 'M', u'ȗ'),
+ (0x217, 'V'),
+ (0x218, 'M', u'ș'),
+ (0x219, 'V'),
+ (0x21A, 'M', u'ț'),
+ (0x21B, 'V'),
+ (0x21C, 'M', u'ȝ'),
+ (0x21D, 'V'),
+ (0x21E, 'M', u'ȟ'),
+ (0x21F, 'V'),
+ (0x220, 'M', u'ƞ'),
+ (0x221, 'V'),
+ (0x222, 'M', u'ȣ'),
+ (0x223, 'V'),
+ (0x224, 'M', u'ȥ'),
+ (0x225, 'V'),
+ (0x226, 'M', u'ȧ'),
+ (0x227, 'V'),
+ (0x228, 'M', u'ȩ'),
+ (0x229, 'V'),
+ (0x22A, 'M', u'ȫ'),
+ (0x22B, 'V'),
+ (0x22C, 'M', u'ȭ'),
+ (0x22D, 'V'),
+ (0x22E, 'M', u'ȯ'),
+ (0x22F, 'V'),
+ (0x230, 'M', u'ȱ'),
+ (0x231, 'V'),
+ (0x232, 'M', u'ȳ'),
+ (0x233, 'V'),
+ (0x23A, 'M', u'ⱥ'),
+ (0x23B, 'M', u'ȼ'),
+ (0x23C, 'V'),
+ (0x23D, 'M', u'ƚ'),
+ (0x23E, 'M', u'ⱦ'),
+ (0x23F, 'V'),
+ (0x241, 'M', u'ɂ'),
+ (0x242, 'V'),
+ (0x243, 'M', u'ƀ'),
+ (0x244, 'M', u'ʉ'),
+ (0x245, 'M', u'ʌ'),
+ (0x246, 'M', u'ɇ'),
+ (0x247, 'V'),
+ (0x248, 'M', u'ɉ'),
+ (0x249, 'V'),
+ (0x24A, 'M', u'ɋ'),
+ (0x24B, 'V'),
+ (0x24C, 'M', u'ɍ'),
+ (0x24D, 'V'),
+ (0x24E, 'M', u'ɏ'),
+ (0x24F, 'V'),
+ (0x2B0, 'M', u'h'),
+ (0x2B1, 'M', u'ɦ'),
+ (0x2B2, 'M', u'j'),
+ (0x2B3, 'M', u'r'),
+ (0x2B4, 'M', u'ɹ'),
+ (0x2B5, 'M', u'ɻ'),
+ (0x2B6, 'M', u'ʁ'),
+ (0x2B7, 'M', u'w'),
+ (0x2B8, 'M', u'y'),
+ (0x2B9, 'V'),
+ (0x2D8, '3', u' ̆'),
+ (0x2D9, '3', u' ̇'),
+ (0x2DA, '3', u' ̊'),
+ (0x2DB, '3', u' ̨'),
+ (0x2DC, '3', u' ̃'),
+ (0x2DD, '3', u' ̋'),
+ (0x2DE, 'V'),
+ (0x2E0, 'M', u'ɣ'),
+ (0x2E1, 'M', u'l'),
+ (0x2E2, 'M', u's'),
+ (0x2E3, 'M', u'x'),
+ (0x2E4, 'M', u'ʕ'),
+ (0x2E5, 'V'),
+ (0x340, 'M', u'̀'),
+ (0x341, 'M', u'́'),
+ (0x342, 'V'),
+ (0x343, 'M', u'̓'),
+ (0x344, 'M', u'̈́'),
+ (0x345, 'M', u'ι'),
+ (0x346, 'V'),
+ (0x34F, 'I'),
+ (0x350, 'V'),
+ (0x370, 'M', u'ͱ'),
+ (0x371, 'V'),
+ (0x372, 'M', u'ͳ'),
+ (0x373, 'V'),
+ (0x374, 'M', u'ʹ'),
+ (0x375, 'V'),
+ (0x376, 'M', u'ͷ'),
+ (0x377, 'V'),
+ ]
+
+def _seg_6():
+ return [
+ (0x378, 'X'),
+ (0x37A, '3', u' ι'),
+ (0x37B, 'V'),
+ (0x37E, '3', u';'),
+ (0x37F, 'M', u'ϳ'),
+ (0x380, 'X'),
+ (0x384, '3', u' ́'),
+ (0x385, '3', u' ̈́'),
+ (0x386, 'M', u'ά'),
+ (0x387, 'M', u'·'),
+ (0x388, 'M', u'έ'),
+ (0x389, 'M', u'ή'),
+ (0x38A, 'M', u'ί'),
+ (0x38B, 'X'),
+ (0x38C, 'M', u'ό'),
+ (0x38D, 'X'),
+ (0x38E, 'M', u'ύ'),
+ (0x38F, 'M', u'ώ'),
+ (0x390, 'V'),
+ (0x391, 'M', u'α'),
+ (0x392, 'M', u'β'),
+ (0x393, 'M', u'γ'),
+ (0x394, 'M', u'δ'),
+ (0x395, 'M', u'ε'),
+ (0x396, 'M', u'ζ'),
+ (0x397, 'M', u'η'),
+ (0x398, 'M', u'θ'),
+ (0x399, 'M', u'ι'),
+ (0x39A, 'M', u'κ'),
+ (0x39B, 'M', u'λ'),
+ (0x39C, 'M', u'μ'),
+ (0x39D, 'M', u'ν'),
+ (0x39E, 'M', u'ξ'),
+ (0x39F, 'M', u'ο'),
+ (0x3A0, 'M', u'π'),
+ (0x3A1, 'M', u'ρ'),
+ (0x3A2, 'X'),
+ (0x3A3, 'M', u'σ'),
+ (0x3A4, 'M', u'τ'),
+ (0x3A5, 'M', u'υ'),
+ (0x3A6, 'M', u'φ'),
+ (0x3A7, 'M', u'χ'),
+ (0x3A8, 'M', u'ψ'),
+ (0x3A9, 'M', u'ω'),
+ (0x3AA, 'M', u'ϊ'),
+ (0x3AB, 'M', u'ϋ'),
+ (0x3AC, 'V'),
+ (0x3C2, 'D', u'σ'),
+ (0x3C3, 'V'),
+ (0x3CF, 'M', u'ϗ'),
+ (0x3D0, 'M', u'β'),
+ (0x3D1, 'M', u'θ'),
+ (0x3D2, 'M', u'υ'),
+ (0x3D3, 'M', u'ύ'),
+ (0x3D4, 'M', u'ϋ'),
+ (0x3D5, 'M', u'φ'),
+ (0x3D6, 'M', u'π'),
+ (0x3D7, 'V'),
+ (0x3D8, 'M', u'ϙ'),
+ (0x3D9, 'V'),
+ (0x3DA, 'M', u'ϛ'),
+ (0x3DB, 'V'),
+ (0x3DC, 'M', u'ϝ'),
+ (0x3DD, 'V'),
+ (0x3DE, 'M', u'ϟ'),
+ (0x3DF, 'V'),
+ (0x3E0, 'M', u'ϡ'),
+ (0x3E1, 'V'),
+ (0x3E2, 'M', u'ϣ'),
+ (0x3E3, 'V'),
+ (0x3E4, 'M', u'ϥ'),
+ (0x3E5, 'V'),
+ (0x3E6, 'M', u'ϧ'),
+ (0x3E7, 'V'),
+ (0x3E8, 'M', u'ϩ'),
+ (0x3E9, 'V'),
+ (0x3EA, 'M', u'ϫ'),
+ (0x3EB, 'V'),
+ (0x3EC, 'M', u'ϭ'),
+ (0x3ED, 'V'),
+ (0x3EE, 'M', u'ϯ'),
+ (0x3EF, 'V'),
+ (0x3F0, 'M', u'κ'),
+ (0x3F1, 'M', u'ρ'),
+ (0x3F2, 'M', u'σ'),
+ (0x3F3, 'V'),
+ (0x3F4, 'M', u'θ'),
+ (0x3F5, 'M', u'ε'),
+ (0x3F6, 'V'),
+ (0x3F7, 'M', u'ϸ'),
+ (0x3F8, 'V'),
+ (0x3F9, 'M', u'σ'),
+ (0x3FA, 'M', u'ϻ'),
+ (0x3FB, 'V'),
+ (0x3FD, 'M', u'ͻ'),
+ (0x3FE, 'M', u'ͼ'),
+ (0x3FF, 'M', u'ͽ'),
+ (0x400, 'M', u'ѐ'),
+ (0x401, 'M', u'ё'),
+ (0x402, 'M', u'ђ'),
+ ]
+
+def _seg_7():
+ return [
+ (0x403, 'M', u'ѓ'),
+ (0x404, 'M', u'є'),
+ (0x405, 'M', u'ѕ'),
+ (0x406, 'M', u'і'),
+ (0x407, 'M', u'ї'),
+ (0x408, 'M', u'ј'),
+ (0x409, 'M', u'љ'),
+ (0x40A, 'M', u'њ'),
+ (0x40B, 'M', u'ћ'),
+ (0x40C, 'M', u'ќ'),
+ (0x40D, 'M', u'ѝ'),
+ (0x40E, 'M', u'ў'),
+ (0x40F, 'M', u'џ'),
+ (0x410, 'M', u'а'),
+ (0x411, 'M', u'б'),
+ (0x412, 'M', u'в'),
+ (0x413, 'M', u'г'),
+ (0x414, 'M', u'д'),
+ (0x415, 'M', u'е'),
+ (0x416, 'M', u'ж'),
+ (0x417, 'M', u'з'),
+ (0x418, 'M', u'и'),
+ (0x419, 'M', u'й'),
+ (0x41A, 'M', u'к'),
+ (0x41B, 'M', u'л'),
+ (0x41C, 'M', u'м'),
+ (0x41D, 'M', u'н'),
+ (0x41E, 'M', u'о'),
+ (0x41F, 'M', u'п'),
+ (0x420, 'M', u'р'),
+ (0x421, 'M', u'с'),
+ (0x422, 'M', u'т'),
+ (0x423, 'M', u'у'),
+ (0x424, 'M', u'ф'),
+ (0x425, 'M', u'х'),
+ (0x426, 'M', u'ц'),
+ (0x427, 'M', u'ч'),
+ (0x428, 'M', u'ш'),
+ (0x429, 'M', u'щ'),
+ (0x42A, 'M', u'ъ'),
+ (0x42B, 'M', u'ы'),
+ (0x42C, 'M', u'ь'),
+ (0x42D, 'M', u'э'),
+ (0x42E, 'M', u'ю'),
+ (0x42F, 'M', u'я'),
+ (0x430, 'V'),
+ (0x460, 'M', u'ѡ'),
+ (0x461, 'V'),
+ (0x462, 'M', u'ѣ'),
+ (0x463, 'V'),
+ (0x464, 'M', u'ѥ'),
+ (0x465, 'V'),
+ (0x466, 'M', u'ѧ'),
+ (0x467, 'V'),
+ (0x468, 'M', u'ѩ'),
+ (0x469, 'V'),
+ (0x46A, 'M', u'ѫ'),
+ (0x46B, 'V'),
+ (0x46C, 'M', u'ѭ'),
+ (0x46D, 'V'),
+ (0x46E, 'M', u'ѯ'),
+ (0x46F, 'V'),
+ (0x470, 'M', u'ѱ'),
+ (0x471, 'V'),
+ (0x472, 'M', u'ѳ'),
+ (0x473, 'V'),
+ (0x474, 'M', u'ѵ'),
+ (0x475, 'V'),
+ (0x476, 'M', u'ѷ'),
+ (0x477, 'V'),
+ (0x478, 'M', u'ѹ'),
+ (0x479, 'V'),
+ (0x47A, 'M', u'ѻ'),
+ (0x47B, 'V'),
+ (0x47C, 'M', u'ѽ'),
+ (0x47D, 'V'),
+ (0x47E, 'M', u'ѿ'),
+ (0x47F, 'V'),
+ (0x480, 'M', u'ҁ'),
+ (0x481, 'V'),
+ (0x48A, 'M', u'ҋ'),
+ (0x48B, 'V'),
+ (0x48C, 'M', u'ҍ'),
+ (0x48D, 'V'),
+ (0x48E, 'M', u'ҏ'),
+ (0x48F, 'V'),
+ (0x490, 'M', u'ґ'),
+ (0x491, 'V'),
+ (0x492, 'M', u'ғ'),
+ (0x493, 'V'),
+ (0x494, 'M', u'ҕ'),
+ (0x495, 'V'),
+ (0x496, 'M', u'җ'),
+ (0x497, 'V'),
+ (0x498, 'M', u'ҙ'),
+ (0x499, 'V'),
+ (0x49A, 'M', u'қ'),
+ (0x49B, 'V'),
+ (0x49C, 'M', u'ҝ'),
+ (0x49D, 'V'),
+ ]
+
+def _seg_8():
+ return [
+ (0x49E, 'M', u'ҟ'),
+ (0x49F, 'V'),
+ (0x4A0, 'M', u'ҡ'),
+ (0x4A1, 'V'),
+ (0x4A2, 'M', u'ң'),
+ (0x4A3, 'V'),
+ (0x4A4, 'M', u'ҥ'),
+ (0x4A5, 'V'),
+ (0x4A6, 'M', u'ҧ'),
+ (0x4A7, 'V'),
+ (0x4A8, 'M', u'ҩ'),
+ (0x4A9, 'V'),
+ (0x4AA, 'M', u'ҫ'),
+ (0x4AB, 'V'),
+ (0x4AC, 'M', u'ҭ'),
+ (0x4AD, 'V'),
+ (0x4AE, 'M', u'ү'),
+ (0x4AF, 'V'),
+ (0x4B0, 'M', u'ұ'),
+ (0x4B1, 'V'),
+ (0x4B2, 'M', u'ҳ'),
+ (0x4B3, 'V'),
+ (0x4B4, 'M', u'ҵ'),
+ (0x4B5, 'V'),
+ (0x4B6, 'M', u'ҷ'),
+ (0x4B7, 'V'),
+ (0x4B8, 'M', u'ҹ'),
+ (0x4B9, 'V'),
+ (0x4BA, 'M', u'һ'),
+ (0x4BB, 'V'),
+ (0x4BC, 'M', u'ҽ'),
+ (0x4BD, 'V'),
+ (0x4BE, 'M', u'ҿ'),
+ (0x4BF, 'V'),
+ (0x4C0, 'X'),
+ (0x4C1, 'M', u'ӂ'),
+ (0x4C2, 'V'),
+ (0x4C3, 'M', u'ӄ'),
+ (0x4C4, 'V'),
+ (0x4C5, 'M', u'ӆ'),
+ (0x4C6, 'V'),
+ (0x4C7, 'M', u'ӈ'),
+ (0x4C8, 'V'),
+ (0x4C9, 'M', u'ӊ'),
+ (0x4CA, 'V'),
+ (0x4CB, 'M', u'ӌ'),
+ (0x4CC, 'V'),
+ (0x4CD, 'M', u'ӎ'),
+ (0x4CE, 'V'),
+ (0x4D0, 'M', u'ӑ'),
+ (0x4D1, 'V'),
+ (0x4D2, 'M', u'ӓ'),
+ (0x4D3, 'V'),
+ (0x4D4, 'M', u'ӕ'),
+ (0x4D5, 'V'),
+ (0x4D6, 'M', u'ӗ'),
+ (0x4D7, 'V'),
+ (0x4D8, 'M', u'ә'),
+ (0x4D9, 'V'),
+ (0x4DA, 'M', u'ӛ'),
+ (0x4DB, 'V'),
+ (0x4DC, 'M', u'ӝ'),
+ (0x4DD, 'V'),
+ (0x4DE, 'M', u'ӟ'),
+ (0x4DF, 'V'),
+ (0x4E0, 'M', u'ӡ'),
+ (0x4E1, 'V'),
+ (0x4E2, 'M', u'ӣ'),
+ (0x4E3, 'V'),
+ (0x4E4, 'M', u'ӥ'),
+ (0x4E5, 'V'),
+ (0x4E6, 'M', u'ӧ'),
+ (0x4E7, 'V'),
+ (0x4E8, 'M', u'ө'),
+ (0x4E9, 'V'),
+ (0x4EA, 'M', u'ӫ'),
+ (0x4EB, 'V'),
+ (0x4EC, 'M', u'ӭ'),
+ (0x4ED, 'V'),
+ (0x4EE, 'M', u'ӯ'),
+ (0x4EF, 'V'),
+ (0x4F0, 'M', u'ӱ'),
+ (0x4F1, 'V'),
+ (0x4F2, 'M', u'ӳ'),
+ (0x4F3, 'V'),
+ (0x4F4, 'M', u'ӵ'),
+ (0x4F5, 'V'),
+ (0x4F6, 'M', u'ӷ'),
+ (0x4F7, 'V'),
+ (0x4F8, 'M', u'ӹ'),
+ (0x4F9, 'V'),
+ (0x4FA, 'M', u'ӻ'),
+ (0x4FB, 'V'),
+ (0x4FC, 'M', u'ӽ'),
+ (0x4FD, 'V'),
+ (0x4FE, 'M', u'ӿ'),
+ (0x4FF, 'V'),
+ (0x500, 'M', u'ԁ'),
+ (0x501, 'V'),
+ (0x502, 'M', u'ԃ'),
+ ]
+
+def _seg_9():
+ return [
+ (0x503, 'V'),
+ (0x504, 'M', u'ԅ'),
+ (0x505, 'V'),
+ (0x506, 'M', u'ԇ'),
+ (0x507, 'V'),
+ (0x508, 'M', u'ԉ'),
+ (0x509, 'V'),
+ (0x50A, 'M', u'ԋ'),
+ (0x50B, 'V'),
+ (0x50C, 'M', u'ԍ'),
+ (0x50D, 'V'),
+ (0x50E, 'M', u'ԏ'),
+ (0x50F, 'V'),
+ (0x510, 'M', u'ԑ'),
+ (0x511, 'V'),
+ (0x512, 'M', u'ԓ'),
+ (0x513, 'V'),
+ (0x514, 'M', u'ԕ'),
+ (0x515, 'V'),
+ (0x516, 'M', u'ԗ'),
+ (0x517, 'V'),
+ (0x518, 'M', u'ԙ'),
+ (0x519, 'V'),
+ (0x51A, 'M', u'ԛ'),
+ (0x51B, 'V'),
+ (0x51C, 'M', u'ԝ'),
+ (0x51D, 'V'),
+ (0x51E, 'M', u'ԟ'),
+ (0x51F, 'V'),
+ (0x520, 'M', u'ԡ'),
+ (0x521, 'V'),
+ (0x522, 'M', u'ԣ'),
+ (0x523, 'V'),
+ (0x524, 'M', u'ԥ'),
+ (0x525, 'V'),
+ (0x526, 'M', u'ԧ'),
+ (0x527, 'V'),
+ (0x528, 'M', u'ԩ'),
+ (0x529, 'V'),
+ (0x52A, 'M', u'ԫ'),
+ (0x52B, 'V'),
+ (0x52C, 'M', u'ԭ'),
+ (0x52D, 'V'),
+ (0x52E, 'M', u'ԯ'),
+ (0x52F, 'V'),
+ (0x530, 'X'),
+ (0x531, 'M', u'ա'),
+ (0x532, 'M', u'բ'),
+ (0x533, 'M', u'գ'),
+ (0x534, 'M', u'դ'),
+ (0x535, 'M', u'ե'),
+ (0x536, 'M', u'զ'),
+ (0x537, 'M', u'է'),
+ (0x538, 'M', u'ը'),
+ (0x539, 'M', u'թ'),
+ (0x53A, 'M', u'ժ'),
+ (0x53B, 'M', u'ի'),
+ (0x53C, 'M', u'լ'),
+ (0x53D, 'M', u'խ'),
+ (0x53E, 'M', u'ծ'),
+ (0x53F, 'M', u'կ'),
+ (0x540, 'M', u'հ'),
+ (0x541, 'M', u'ձ'),
+ (0x542, 'M', u'ղ'),
+ (0x543, 'M', u'ճ'),
+ (0x544, 'M', u'մ'),
+ (0x545, 'M', u'յ'),
+ (0x546, 'M', u'ն'),
+ (0x547, 'M', u'շ'),
+ (0x548, 'M', u'ո'),
+ (0x549, 'M', u'չ'),
+ (0x54A, 'M', u'պ'),
+ (0x54B, 'M', u'ջ'),
+ (0x54C, 'M', u'ռ'),
+ (0x54D, 'M', u'ս'),
+ (0x54E, 'M', u'վ'),
+ (0x54F, 'M', u'տ'),
+ (0x550, 'M', u'ր'),
+ (0x551, 'M', u'ց'),
+ (0x552, 'M', u'ւ'),
+ (0x553, 'M', u'փ'),
+ (0x554, 'M', u'ք'),
+ (0x555, 'M', u'օ'),
+ (0x556, 'M', u'ֆ'),
+ (0x557, 'X'),
+ (0x559, 'V'),
+ (0x587, 'M', u'եւ'),
+ (0x588, 'V'),
+ (0x58B, 'X'),
+ (0x58D, 'V'),
+ (0x590, 'X'),
+ (0x591, 'V'),
+ (0x5C8, 'X'),
+ (0x5D0, 'V'),
+ (0x5EB, 'X'),
+ (0x5EF, 'V'),
+ (0x5F5, 'X'),
+ (0x606, 'V'),
+ (0x61C, 'X'),
+ (0x61E, 'V'),
+ ]
+
+def _seg_10():
+ return [
+ (0x675, 'M', u'اٴ'),
+ (0x676, 'M', u'وٴ'),
+ (0x677, 'M', u'ۇٴ'),
+ (0x678, 'M', u'يٴ'),
+ (0x679, 'V'),
+ (0x6DD, 'X'),
+ (0x6DE, 'V'),
+ (0x70E, 'X'),
+ (0x710, 'V'),
+ (0x74B, 'X'),
+ (0x74D, 'V'),
+ (0x7B2, 'X'),
+ (0x7C0, 'V'),
+ (0x7FB, 'X'),
+ (0x7FD, 'V'),
+ (0x82E, 'X'),
+ (0x830, 'V'),
+ (0x83F, 'X'),
+ (0x840, 'V'),
+ (0x85C, 'X'),
+ (0x85E, 'V'),
+ (0x85F, 'X'),
+ (0x860, 'V'),
+ (0x86B, 'X'),
+ (0x8A0, 'V'),
+ (0x8B5, 'X'),
+ (0x8B6, 'V'),
+ (0x8C8, 'X'),
+ (0x8D3, 'V'),
+ (0x8E2, 'X'),
+ (0x8E3, 'V'),
+ (0x958, 'M', u'क़'),
+ (0x959, 'M', u'ख़'),
+ (0x95A, 'M', u'ग़'),
+ (0x95B, 'M', u'ज़'),
+ (0x95C, 'M', u'ड़'),
+ (0x95D, 'M', u'ढ़'),
+ (0x95E, 'M', u'फ़'),
+ (0x95F, 'M', u'य़'),
+ (0x960, 'V'),
+ (0x984, 'X'),
+ (0x985, 'V'),
+ (0x98D, 'X'),
+ (0x98F, 'V'),
+ (0x991, 'X'),
+ (0x993, 'V'),
+ (0x9A9, 'X'),
+ (0x9AA, 'V'),
+ (0x9B1, 'X'),
+ (0x9B2, 'V'),
+ (0x9B3, 'X'),
+ (0x9B6, 'V'),
+ (0x9BA, 'X'),
+ (0x9BC, 'V'),
+ (0x9C5, 'X'),
+ (0x9C7, 'V'),
+ (0x9C9, 'X'),
+ (0x9CB, 'V'),
+ (0x9CF, 'X'),
+ (0x9D7, 'V'),
+ (0x9D8, 'X'),
+ (0x9DC, 'M', u'ড়'),
+ (0x9DD, 'M', u'ঢ়'),
+ (0x9DE, 'X'),
+ (0x9DF, 'M', u'য়'),
+ (0x9E0, 'V'),
+ (0x9E4, 'X'),
+ (0x9E6, 'V'),
+ (0x9FF, 'X'),
+ (0xA01, 'V'),
+ (0xA04, 'X'),
+ (0xA05, 'V'),
+ (0xA0B, 'X'),
+ (0xA0F, 'V'),
+ (0xA11, 'X'),
+ (0xA13, 'V'),
+ (0xA29, 'X'),
+ (0xA2A, 'V'),
+ (0xA31, 'X'),
+ (0xA32, 'V'),
+ (0xA33, 'M', u'ਲ਼'),
+ (0xA34, 'X'),
+ (0xA35, 'V'),
+ (0xA36, 'M', u'ਸ਼'),
+ (0xA37, 'X'),
+ (0xA38, 'V'),
+ (0xA3A, 'X'),
+ (0xA3C, 'V'),
+ (0xA3D, 'X'),
+ (0xA3E, 'V'),
+ (0xA43, 'X'),
+ (0xA47, 'V'),
+ (0xA49, 'X'),
+ (0xA4B, 'V'),
+ (0xA4E, 'X'),
+ (0xA51, 'V'),
+ (0xA52, 'X'),
+ (0xA59, 'M', u'ਖ਼'),
+ (0xA5A, 'M', u'ਗ਼'),
+ (0xA5B, 'M', u'ਜ਼'),
+ ]
+
+def _seg_11():
+ return [
+ (0xA5C, 'V'),
+ (0xA5D, 'X'),
+ (0xA5E, 'M', u'ਫ਼'),
+ (0xA5F, 'X'),
+ (0xA66, 'V'),
+ (0xA77, 'X'),
+ (0xA81, 'V'),
+ (0xA84, 'X'),
+ (0xA85, 'V'),
+ (0xA8E, 'X'),
+ (0xA8F, 'V'),
+ (0xA92, 'X'),
+ (0xA93, 'V'),
+ (0xAA9, 'X'),
+ (0xAAA, 'V'),
+ (0xAB1, 'X'),
+ (0xAB2, 'V'),
+ (0xAB4, 'X'),
+ (0xAB5, 'V'),
+ (0xABA, 'X'),
+ (0xABC, 'V'),
+ (0xAC6, 'X'),
+ (0xAC7, 'V'),
+ (0xACA, 'X'),
+ (0xACB, 'V'),
+ (0xACE, 'X'),
+ (0xAD0, 'V'),
+ (0xAD1, 'X'),
+ (0xAE0, 'V'),
+ (0xAE4, 'X'),
+ (0xAE6, 'V'),
+ (0xAF2, 'X'),
+ (0xAF9, 'V'),
+ (0xB00, 'X'),
+ (0xB01, 'V'),
+ (0xB04, 'X'),
+ (0xB05, 'V'),
+ (0xB0D, 'X'),
+ (0xB0F, 'V'),
+ (0xB11, 'X'),
+ (0xB13, 'V'),
+ (0xB29, 'X'),
+ (0xB2A, 'V'),
+ (0xB31, 'X'),
+ (0xB32, 'V'),
+ (0xB34, 'X'),
+ (0xB35, 'V'),
+ (0xB3A, 'X'),
+ (0xB3C, 'V'),
+ (0xB45, 'X'),
+ (0xB47, 'V'),
+ (0xB49, 'X'),
+ (0xB4B, 'V'),
+ (0xB4E, 'X'),
+ (0xB55, 'V'),
+ (0xB58, 'X'),
+ (0xB5C, 'M', u'ଡ଼'),
+ (0xB5D, 'M', u'ଢ଼'),
+ (0xB5E, 'X'),
+ (0xB5F, 'V'),
+ (0xB64, 'X'),
+ (0xB66, 'V'),
+ (0xB78, 'X'),
+ (0xB82, 'V'),
+ (0xB84, 'X'),
+ (0xB85, 'V'),
+ (0xB8B, 'X'),
+ (0xB8E, 'V'),
+ (0xB91, 'X'),
+ (0xB92, 'V'),
+ (0xB96, 'X'),
+ (0xB99, 'V'),
+ (0xB9B, 'X'),
+ (0xB9C, 'V'),
+ (0xB9D, 'X'),
+ (0xB9E, 'V'),
+ (0xBA0, 'X'),
+ (0xBA3, 'V'),
+ (0xBA5, 'X'),
+ (0xBA8, 'V'),
+ (0xBAB, 'X'),
+ (0xBAE, 'V'),
+ (0xBBA, 'X'),
+ (0xBBE, 'V'),
+ (0xBC3, 'X'),
+ (0xBC6, 'V'),
+ (0xBC9, 'X'),
+ (0xBCA, 'V'),
+ (0xBCE, 'X'),
+ (0xBD0, 'V'),
+ (0xBD1, 'X'),
+ (0xBD7, 'V'),
+ (0xBD8, 'X'),
+ (0xBE6, 'V'),
+ (0xBFB, 'X'),
+ (0xC00, 'V'),
+ (0xC0D, 'X'),
+ (0xC0E, 'V'),
+ (0xC11, 'X'),
+ (0xC12, 'V'),
+ ]
+
+def _seg_12():
+ return [
+ (0xC29, 'X'),
+ (0xC2A, 'V'),
+ (0xC3A, 'X'),
+ (0xC3D, 'V'),
+ (0xC45, 'X'),
+ (0xC46, 'V'),
+ (0xC49, 'X'),
+ (0xC4A, 'V'),
+ (0xC4E, 'X'),
+ (0xC55, 'V'),
+ (0xC57, 'X'),
+ (0xC58, 'V'),
+ (0xC5B, 'X'),
+ (0xC60, 'V'),
+ (0xC64, 'X'),
+ (0xC66, 'V'),
+ (0xC70, 'X'),
+ (0xC77, 'V'),
+ (0xC8D, 'X'),
+ (0xC8E, 'V'),
+ (0xC91, 'X'),
+ (0xC92, 'V'),
+ (0xCA9, 'X'),
+ (0xCAA, 'V'),
+ (0xCB4, 'X'),
+ (0xCB5, 'V'),
+ (0xCBA, 'X'),
+ (0xCBC, 'V'),
+ (0xCC5, 'X'),
+ (0xCC6, 'V'),
+ (0xCC9, 'X'),
+ (0xCCA, 'V'),
+ (0xCCE, 'X'),
+ (0xCD5, 'V'),
+ (0xCD7, 'X'),
+ (0xCDE, 'V'),
+ (0xCDF, 'X'),
+ (0xCE0, 'V'),
+ (0xCE4, 'X'),
+ (0xCE6, 'V'),
+ (0xCF0, 'X'),
+ (0xCF1, 'V'),
+ (0xCF3, 'X'),
+ (0xD00, 'V'),
+ (0xD0D, 'X'),
+ (0xD0E, 'V'),
+ (0xD11, 'X'),
+ (0xD12, 'V'),
+ (0xD45, 'X'),
+ (0xD46, 'V'),
+ (0xD49, 'X'),
+ (0xD4A, 'V'),
+ (0xD50, 'X'),
+ (0xD54, 'V'),
+ (0xD64, 'X'),
+ (0xD66, 'V'),
+ (0xD80, 'X'),
+ (0xD81, 'V'),
+ (0xD84, 'X'),
+ (0xD85, 'V'),
+ (0xD97, 'X'),
+ (0xD9A, 'V'),
+ (0xDB2, 'X'),
+ (0xDB3, 'V'),
+ (0xDBC, 'X'),
+ (0xDBD, 'V'),
+ (0xDBE, 'X'),
+ (0xDC0, 'V'),
+ (0xDC7, 'X'),
+ (0xDCA, 'V'),
+ (0xDCB, 'X'),
+ (0xDCF, 'V'),
+ (0xDD5, 'X'),
+ (0xDD6, 'V'),
+ (0xDD7, 'X'),
+ (0xDD8, 'V'),
+ (0xDE0, 'X'),
+ (0xDE6, 'V'),
+ (0xDF0, 'X'),
+ (0xDF2, 'V'),
+ (0xDF5, 'X'),
+ (0xE01, 'V'),
+ (0xE33, 'M', u'ํา'),
+ (0xE34, 'V'),
+ (0xE3B, 'X'),
+ (0xE3F, 'V'),
+ (0xE5C, 'X'),
+ (0xE81, 'V'),
+ (0xE83, 'X'),
+ (0xE84, 'V'),
+ (0xE85, 'X'),
+ (0xE86, 'V'),
+ (0xE8B, 'X'),
+ (0xE8C, 'V'),
+ (0xEA4, 'X'),
+ (0xEA5, 'V'),
+ (0xEA6, 'X'),
+ (0xEA7, 'V'),
+ (0xEB3, 'M', u'ໍາ'),
+ (0xEB4, 'V'),
+ ]
+
+def _seg_13():
+ return [
+ (0xEBE, 'X'),
+ (0xEC0, 'V'),
+ (0xEC5, 'X'),
+ (0xEC6, 'V'),
+ (0xEC7, 'X'),
+ (0xEC8, 'V'),
+ (0xECE, 'X'),
+ (0xED0, 'V'),
+ (0xEDA, 'X'),
+ (0xEDC, 'M', u'ຫນ'),
+ (0xEDD, 'M', u'ຫມ'),
+ (0xEDE, 'V'),
+ (0xEE0, 'X'),
+ (0xF00, 'V'),
+ (0xF0C, 'M', u'་'),
+ (0xF0D, 'V'),
+ (0xF43, 'M', u'གྷ'),
+ (0xF44, 'V'),
+ (0xF48, 'X'),
+ (0xF49, 'V'),
+ (0xF4D, 'M', u'ཌྷ'),
+ (0xF4E, 'V'),
+ (0xF52, 'M', u'དྷ'),
+ (0xF53, 'V'),
+ (0xF57, 'M', u'བྷ'),
+ (0xF58, 'V'),
+ (0xF5C, 'M', u'ཛྷ'),
+ (0xF5D, 'V'),
+ (0xF69, 'M', u'ཀྵ'),
+ (0xF6A, 'V'),
+ (0xF6D, 'X'),
+ (0xF71, 'V'),
+ (0xF73, 'M', u'ཱི'),
+ (0xF74, 'V'),
+ (0xF75, 'M', u'ཱུ'),
+ (0xF76, 'M', u'ྲྀ'),
+ (0xF77, 'M', u'ྲཱྀ'),
+ (0xF78, 'M', u'ླྀ'),
+ (0xF79, 'M', u'ླཱྀ'),
+ (0xF7A, 'V'),
+ (0xF81, 'M', u'ཱྀ'),
+ (0xF82, 'V'),
+ (0xF93, 'M', u'ྒྷ'),
+ (0xF94, 'V'),
+ (0xF98, 'X'),
+ (0xF99, 'V'),
+ (0xF9D, 'M', u'ྜྷ'),
+ (0xF9E, 'V'),
+ (0xFA2, 'M', u'ྡྷ'),
+ (0xFA3, 'V'),
+ (0xFA7, 'M', u'ྦྷ'),
+ (0xFA8, 'V'),
+ (0xFAC, 'M', u'ྫྷ'),
+ (0xFAD, 'V'),
+ (0xFB9, 'M', u'ྐྵ'),
+ (0xFBA, 'V'),
+ (0xFBD, 'X'),
+ (0xFBE, 'V'),
+ (0xFCD, 'X'),
+ (0xFCE, 'V'),
+ (0xFDB, 'X'),
+ (0x1000, 'V'),
+ (0x10A0, 'X'),
+ (0x10C7, 'M', u'ⴧ'),
+ (0x10C8, 'X'),
+ (0x10CD, 'M', u'ⴭ'),
+ (0x10CE, 'X'),
+ (0x10D0, 'V'),
+ (0x10FC, 'M', u'ნ'),
+ (0x10FD, 'V'),
+ (0x115F, 'X'),
+ (0x1161, 'V'),
+ (0x1249, 'X'),
+ (0x124A, 'V'),
+ (0x124E, 'X'),
+ (0x1250, 'V'),
+ (0x1257, 'X'),
+ (0x1258, 'V'),
+ (0x1259, 'X'),
+ (0x125A, 'V'),
+ (0x125E, 'X'),
+ (0x1260, 'V'),
+ (0x1289, 'X'),
+ (0x128A, 'V'),
+ (0x128E, 'X'),
+ (0x1290, 'V'),
+ (0x12B1, 'X'),
+ (0x12B2, 'V'),
+ (0x12B6, 'X'),
+ (0x12B8, 'V'),
+ (0x12BF, 'X'),
+ (0x12C0, 'V'),
+ (0x12C1, 'X'),
+ (0x12C2, 'V'),
+ (0x12C6, 'X'),
+ (0x12C8, 'V'),
+ (0x12D7, 'X'),
+ (0x12D8, 'V'),
+ (0x1311, 'X'),
+ (0x1312, 'V'),
+ ]
+
+def _seg_14():
+ return [
+ (0x1316, 'X'),
+ (0x1318, 'V'),
+ (0x135B, 'X'),
+ (0x135D, 'V'),
+ (0x137D, 'X'),
+ (0x1380, 'V'),
+ (0x139A, 'X'),
+ (0x13A0, 'V'),
+ (0x13F6, 'X'),
+ (0x13F8, 'M', u'Ᏸ'),
+ (0x13F9, 'M', u'Ᏹ'),
+ (0x13FA, 'M', u'Ᏺ'),
+ (0x13FB, 'M', u'Ᏻ'),
+ (0x13FC, 'M', u'Ᏼ'),
+ (0x13FD, 'M', u'Ᏽ'),
+ (0x13FE, 'X'),
+ (0x1400, 'V'),
+ (0x1680, 'X'),
+ (0x1681, 'V'),
+ (0x169D, 'X'),
+ (0x16A0, 'V'),
+ (0x16F9, 'X'),
+ (0x1700, 'V'),
+ (0x170D, 'X'),
+ (0x170E, 'V'),
+ (0x1715, 'X'),
+ (0x1720, 'V'),
+ (0x1737, 'X'),
+ (0x1740, 'V'),
+ (0x1754, 'X'),
+ (0x1760, 'V'),
+ (0x176D, 'X'),
+ (0x176E, 'V'),
+ (0x1771, 'X'),
+ (0x1772, 'V'),
+ (0x1774, 'X'),
+ (0x1780, 'V'),
+ (0x17B4, 'X'),
+ (0x17B6, 'V'),
+ (0x17DE, 'X'),
+ (0x17E0, 'V'),
+ (0x17EA, 'X'),
+ (0x17F0, 'V'),
+ (0x17FA, 'X'),
+ (0x1800, 'V'),
+ (0x1806, 'X'),
+ (0x1807, 'V'),
+ (0x180B, 'I'),
+ (0x180E, 'X'),
+ (0x1810, 'V'),
+ (0x181A, 'X'),
+ (0x1820, 'V'),
+ (0x1879, 'X'),
+ (0x1880, 'V'),
+ (0x18AB, 'X'),
+ (0x18B0, 'V'),
+ (0x18F6, 'X'),
+ (0x1900, 'V'),
+ (0x191F, 'X'),
+ (0x1920, 'V'),
+ (0x192C, 'X'),
+ (0x1930, 'V'),
+ (0x193C, 'X'),
+ (0x1940, 'V'),
+ (0x1941, 'X'),
+ (0x1944, 'V'),
+ (0x196E, 'X'),
+ (0x1970, 'V'),
+ (0x1975, 'X'),
+ (0x1980, 'V'),
+ (0x19AC, 'X'),
+ (0x19B0, 'V'),
+ (0x19CA, 'X'),
+ (0x19D0, 'V'),
+ (0x19DB, 'X'),
+ (0x19DE, 'V'),
+ (0x1A1C, 'X'),
+ (0x1A1E, 'V'),
+ (0x1A5F, 'X'),
+ (0x1A60, 'V'),
+ (0x1A7D, 'X'),
+ (0x1A7F, 'V'),
+ (0x1A8A, 'X'),
+ (0x1A90, 'V'),
+ (0x1A9A, 'X'),
+ (0x1AA0, 'V'),
+ (0x1AAE, 'X'),
+ (0x1AB0, 'V'),
+ (0x1AC1, 'X'),
+ (0x1B00, 'V'),
+ (0x1B4C, 'X'),
+ (0x1B50, 'V'),
+ (0x1B7D, 'X'),
+ (0x1B80, 'V'),
+ (0x1BF4, 'X'),
+ (0x1BFC, 'V'),
+ (0x1C38, 'X'),
+ (0x1C3B, 'V'),
+ (0x1C4A, 'X'),
+ (0x1C4D, 'V'),
+ ]
+
+def _seg_15():
+ return [
+ (0x1C80, 'M', u'в'),
+ (0x1C81, 'M', u'д'),
+ (0x1C82, 'M', u'о'),
+ (0x1C83, 'M', u'с'),
+ (0x1C84, 'M', u'т'),
+ (0x1C86, 'M', u'ъ'),
+ (0x1C87, 'M', u'ѣ'),
+ (0x1C88, 'M', u'ꙋ'),
+ (0x1C89, 'X'),
+ (0x1C90, 'M', u'ა'),
+ (0x1C91, 'M', u'ბ'),
+ (0x1C92, 'M', u'გ'),
+ (0x1C93, 'M', u'დ'),
+ (0x1C94, 'M', u'ე'),
+ (0x1C95, 'M', u'ვ'),
+ (0x1C96, 'M', u'ზ'),
+ (0x1C97, 'M', u'თ'),
+ (0x1C98, 'M', u'ი'),
+ (0x1C99, 'M', u'კ'),
+ (0x1C9A, 'M', u'ლ'),
+ (0x1C9B, 'M', u'მ'),
+ (0x1C9C, 'M', u'ნ'),
+ (0x1C9D, 'M', u'ო'),
+ (0x1C9E, 'M', u'პ'),
+ (0x1C9F, 'M', u'ჟ'),
+ (0x1CA0, 'M', u'რ'),
+ (0x1CA1, 'M', u'ს'),
+ (0x1CA2, 'M', u'ტ'),
+ (0x1CA3, 'M', u'უ'),
+ (0x1CA4, 'M', u'ფ'),
+ (0x1CA5, 'M', u'ქ'),
+ (0x1CA6, 'M', u'ღ'),
+ (0x1CA7, 'M', u'ყ'),
+ (0x1CA8, 'M', u'შ'),
+ (0x1CA9, 'M', u'ჩ'),
+ (0x1CAA, 'M', u'ც'),
+ (0x1CAB, 'M', u'ძ'),
+ (0x1CAC, 'M', u'წ'),
+ (0x1CAD, 'M', u'ჭ'),
+ (0x1CAE, 'M', u'ხ'),
+ (0x1CAF, 'M', u'ჯ'),
+ (0x1CB0, 'M', u'ჰ'),
+ (0x1CB1, 'M', u'ჱ'),
+ (0x1CB2, 'M', u'ჲ'),
+ (0x1CB3, 'M', u'ჳ'),
+ (0x1CB4, 'M', u'ჴ'),
+ (0x1CB5, 'M', u'ჵ'),
+ (0x1CB6, 'M', u'ჶ'),
+ (0x1CB7, 'M', u'ჷ'),
+ (0x1CB8, 'M', u'ჸ'),
+ (0x1CB9, 'M', u'ჹ'),
+ (0x1CBA, 'M', u'ჺ'),
+ (0x1CBB, 'X'),
+ (0x1CBD, 'M', u'ჽ'),
+ (0x1CBE, 'M', u'ჾ'),
+ (0x1CBF, 'M', u'ჿ'),
+ (0x1CC0, 'V'),
+ (0x1CC8, 'X'),
+ (0x1CD0, 'V'),
+ (0x1CFB, 'X'),
+ (0x1D00, 'V'),
+ (0x1D2C, 'M', u'a'),
+ (0x1D2D, 'M', u'æ'),
+ (0x1D2E, 'M', u'b'),
+ (0x1D2F, 'V'),
+ (0x1D30, 'M', u'd'),
+ (0x1D31, 'M', u'e'),
+ (0x1D32, 'M', u'ǝ'),
+ (0x1D33, 'M', u'g'),
+ (0x1D34, 'M', u'h'),
+ (0x1D35, 'M', u'i'),
+ (0x1D36, 'M', u'j'),
+ (0x1D37, 'M', u'k'),
+ (0x1D38, 'M', u'l'),
+ (0x1D39, 'M', u'm'),
+ (0x1D3A, 'M', u'n'),
+ (0x1D3B, 'V'),
+ (0x1D3C, 'M', u'o'),
+ (0x1D3D, 'M', u'ȣ'),
+ (0x1D3E, 'M', u'p'),
+ (0x1D3F, 'M', u'r'),
+ (0x1D40, 'M', u't'),
+ (0x1D41, 'M', u'u'),
+ (0x1D42, 'M', u'w'),
+ (0x1D43, 'M', u'a'),
+ (0x1D44, 'M', u'ɐ'),
+ (0x1D45, 'M', u'ɑ'),
+ (0x1D46, 'M', u'ᴂ'),
+ (0x1D47, 'M', u'b'),
+ (0x1D48, 'M', u'd'),
+ (0x1D49, 'M', u'e'),
+ (0x1D4A, 'M', u'ə'),
+ (0x1D4B, 'M', u'ɛ'),
+ (0x1D4C, 'M', u'ɜ'),
+ (0x1D4D, 'M', u'g'),
+ (0x1D4E, 'V'),
+ (0x1D4F, 'M', u'k'),
+ (0x1D50, 'M', u'm'),
+ (0x1D51, 'M', u'ŋ'),
+ (0x1D52, 'M', u'o'),
+ ]
+
+def _seg_16():
+ return [
+ (0x1D53, 'M', u'ɔ'),
+ (0x1D54, 'M', u'ᴖ'),
+ (0x1D55, 'M', u'ᴗ'),
+ (0x1D56, 'M', u'p'),
+ (0x1D57, 'M', u't'),
+ (0x1D58, 'M', u'u'),
+ (0x1D59, 'M', u'ᴝ'),
+ (0x1D5A, 'M', u'ɯ'),
+ (0x1D5B, 'M', u'v'),
+ (0x1D5C, 'M', u'ᴥ'),
+ (0x1D5D, 'M', u'β'),
+ (0x1D5E, 'M', u'γ'),
+ (0x1D5F, 'M', u'δ'),
+ (0x1D60, 'M', u'φ'),
+ (0x1D61, 'M', u'χ'),
+ (0x1D62, 'M', u'i'),
+ (0x1D63, 'M', u'r'),
+ (0x1D64, 'M', u'u'),
+ (0x1D65, 'M', u'v'),
+ (0x1D66, 'M', u'β'),
+ (0x1D67, 'M', u'γ'),
+ (0x1D68, 'M', u'ρ'),
+ (0x1D69, 'M', u'φ'),
+ (0x1D6A, 'M', u'χ'),
+ (0x1D6B, 'V'),
+ (0x1D78, 'M', u'н'),
+ (0x1D79, 'V'),
+ (0x1D9B, 'M', u'ɒ'),
+ (0x1D9C, 'M', u'c'),
+ (0x1D9D, 'M', u'ɕ'),
+ (0x1D9E, 'M', u'ð'),
+ (0x1D9F, 'M', u'ɜ'),
+ (0x1DA0, 'M', u'f'),
+ (0x1DA1, 'M', u'ɟ'),
+ (0x1DA2, 'M', u'ɡ'),
+ (0x1DA3, 'M', u'ɥ'),
+ (0x1DA4, 'M', u'ɨ'),
+ (0x1DA5, 'M', u'ɩ'),
+ (0x1DA6, 'M', u'ɪ'),
+ (0x1DA7, 'M', u'ᵻ'),
+ (0x1DA8, 'M', u'ʝ'),
+ (0x1DA9, 'M', u'ɭ'),
+ (0x1DAA, 'M', u'ᶅ'),
+ (0x1DAB, 'M', u'ʟ'),
+ (0x1DAC, 'M', u'ɱ'),
+ (0x1DAD, 'M', u'ɰ'),
+ (0x1DAE, 'M', u'ɲ'),
+ (0x1DAF, 'M', u'ɳ'),
+ (0x1DB0, 'M', u'ɴ'),
+ (0x1DB1, 'M', u'ɵ'),
+ (0x1DB2, 'M', u'ɸ'),
+ (0x1DB3, 'M', u'ʂ'),
+ (0x1DB4, 'M', u'ʃ'),
+ (0x1DB5, 'M', u'ƫ'),
+ (0x1DB6, 'M', u'ʉ'),
+ (0x1DB7, 'M', u'ʊ'),
+ (0x1DB8, 'M', u'ᴜ'),
+ (0x1DB9, 'M', u'ʋ'),
+ (0x1DBA, 'M', u'ʌ'),
+ (0x1DBB, 'M', u'z'),
+ (0x1DBC, 'M', u'ʐ'),
+ (0x1DBD, 'M', u'ʑ'),
+ (0x1DBE, 'M', u'ʒ'),
+ (0x1DBF, 'M', u'θ'),
+ (0x1DC0, 'V'),
+ (0x1DFA, 'X'),
+ (0x1DFB, 'V'),
+ (0x1E00, 'M', u'ḁ'),
+ (0x1E01, 'V'),
+ (0x1E02, 'M', u'ḃ'),
+ (0x1E03, 'V'),
+ (0x1E04, 'M', u'ḅ'),
+ (0x1E05, 'V'),
+ (0x1E06, 'M', u'ḇ'),
+ (0x1E07, 'V'),
+ (0x1E08, 'M', u'ḉ'),
+ (0x1E09, 'V'),
+ (0x1E0A, 'M', u'ḋ'),
+ (0x1E0B, 'V'),
+ (0x1E0C, 'M', u'ḍ'),
+ (0x1E0D, 'V'),
+ (0x1E0E, 'M', u'ḏ'),
+ (0x1E0F, 'V'),
+ (0x1E10, 'M', u'ḑ'),
+ (0x1E11, 'V'),
+ (0x1E12, 'M', u'ḓ'),
+ (0x1E13, 'V'),
+ (0x1E14, 'M', u'ḕ'),
+ (0x1E15, 'V'),
+ (0x1E16, 'M', u'ḗ'),
+ (0x1E17, 'V'),
+ (0x1E18, 'M', u'ḙ'),
+ (0x1E19, 'V'),
+ (0x1E1A, 'M', u'ḛ'),
+ (0x1E1B, 'V'),
+ (0x1E1C, 'M', u'ḝ'),
+ (0x1E1D, 'V'),
+ (0x1E1E, 'M', u'ḟ'),
+ (0x1E1F, 'V'),
+ (0x1E20, 'M', u'ḡ'),
+ ]
+
+def _seg_17():
+ return [
+ (0x1E21, 'V'),
+ (0x1E22, 'M', u'ḣ'),
+ (0x1E23, 'V'),
+ (0x1E24, 'M', u'ḥ'),
+ (0x1E25, 'V'),
+ (0x1E26, 'M', u'ḧ'),
+ (0x1E27, 'V'),
+ (0x1E28, 'M', u'ḩ'),
+ (0x1E29, 'V'),
+ (0x1E2A, 'M', u'ḫ'),
+ (0x1E2B, 'V'),
+ (0x1E2C, 'M', u'ḭ'),
+ (0x1E2D, 'V'),
+ (0x1E2E, 'M', u'ḯ'),
+ (0x1E2F, 'V'),
+ (0x1E30, 'M', u'ḱ'),
+ (0x1E31, 'V'),
+ (0x1E32, 'M', u'ḳ'),
+ (0x1E33, 'V'),
+ (0x1E34, 'M', u'ḵ'),
+ (0x1E35, 'V'),
+ (0x1E36, 'M', u'ḷ'),
+ (0x1E37, 'V'),
+ (0x1E38, 'M', u'ḹ'),
+ (0x1E39, 'V'),
+ (0x1E3A, 'M', u'ḻ'),
+ (0x1E3B, 'V'),
+ (0x1E3C, 'M', u'ḽ'),
+ (0x1E3D, 'V'),
+ (0x1E3E, 'M', u'ḿ'),
+ (0x1E3F, 'V'),
+ (0x1E40, 'M', u'ṁ'),
+ (0x1E41, 'V'),
+ (0x1E42, 'M', u'ṃ'),
+ (0x1E43, 'V'),
+ (0x1E44, 'M', u'ṅ'),
+ (0x1E45, 'V'),
+ (0x1E46, 'M', u'ṇ'),
+ (0x1E47, 'V'),
+ (0x1E48, 'M', u'ṉ'),
+ (0x1E49, 'V'),
+ (0x1E4A, 'M', u'ṋ'),
+ (0x1E4B, 'V'),
+ (0x1E4C, 'M', u'ṍ'),
+ (0x1E4D, 'V'),
+ (0x1E4E, 'M', u'ṏ'),
+ (0x1E4F, 'V'),
+ (0x1E50, 'M', u'ṑ'),
+ (0x1E51, 'V'),
+ (0x1E52, 'M', u'ṓ'),
+ (0x1E53, 'V'),
+ (0x1E54, 'M', u'ṕ'),
+ (0x1E55, 'V'),
+ (0x1E56, 'M', u'ṗ'),
+ (0x1E57, 'V'),
+ (0x1E58, 'M', u'ṙ'),
+ (0x1E59, 'V'),
+ (0x1E5A, 'M', u'ṛ'),
+ (0x1E5B, 'V'),
+ (0x1E5C, 'M', u'ṝ'),
+ (0x1E5D, 'V'),
+ (0x1E5E, 'M', u'ṟ'),
+ (0x1E5F, 'V'),
+ (0x1E60, 'M', u'ṡ'),
+ (0x1E61, 'V'),
+ (0x1E62, 'M', u'ṣ'),
+ (0x1E63, 'V'),
+ (0x1E64, 'M', u'ṥ'),
+ (0x1E65, 'V'),
+ (0x1E66, 'M', u'ṧ'),
+ (0x1E67, 'V'),
+ (0x1E68, 'M', u'ṩ'),
+ (0x1E69, 'V'),
+ (0x1E6A, 'M', u'ṫ'),
+ (0x1E6B, 'V'),
+ (0x1E6C, 'M', u'ṭ'),
+ (0x1E6D, 'V'),
+ (0x1E6E, 'M', u'ṯ'),
+ (0x1E6F, 'V'),
+ (0x1E70, 'M', u'ṱ'),
+ (0x1E71, 'V'),
+ (0x1E72, 'M', u'ṳ'),
+ (0x1E73, 'V'),
+ (0x1E74, 'M', u'ṵ'),
+ (0x1E75, 'V'),
+ (0x1E76, 'M', u'ṷ'),
+ (0x1E77, 'V'),
+ (0x1E78, 'M', u'ṹ'),
+ (0x1E79, 'V'),
+ (0x1E7A, 'M', u'ṻ'),
+ (0x1E7B, 'V'),
+ (0x1E7C, 'M', u'ṽ'),
+ (0x1E7D, 'V'),
+ (0x1E7E, 'M', u'ṿ'),
+ (0x1E7F, 'V'),
+ (0x1E80, 'M', u'ẁ'),
+ (0x1E81, 'V'),
+ (0x1E82, 'M', u'ẃ'),
+ (0x1E83, 'V'),
+ (0x1E84, 'M', u'ẅ'),
+ ]
+
+def _seg_18():
+ return [
+ (0x1E85, 'V'),
+ (0x1E86, 'M', u'ẇ'),
+ (0x1E87, 'V'),
+ (0x1E88, 'M', u'ẉ'),
+ (0x1E89, 'V'),
+ (0x1E8A, 'M', u'ẋ'),
+ (0x1E8B, 'V'),
+ (0x1E8C, 'M', u'ẍ'),
+ (0x1E8D, 'V'),
+ (0x1E8E, 'M', u'ẏ'),
+ (0x1E8F, 'V'),
+ (0x1E90, 'M', u'ẑ'),
+ (0x1E91, 'V'),
+ (0x1E92, 'M', u'ẓ'),
+ (0x1E93, 'V'),
+ (0x1E94, 'M', u'ẕ'),
+ (0x1E95, 'V'),
+ (0x1E9A, 'M', u'aʾ'),
+ (0x1E9B, 'M', u'ṡ'),
+ (0x1E9C, 'V'),
+ (0x1E9E, 'M', u'ss'),
+ (0x1E9F, 'V'),
+ (0x1EA0, 'M', u'ạ'),
+ (0x1EA1, 'V'),
+ (0x1EA2, 'M', u'ả'),
+ (0x1EA3, 'V'),
+ (0x1EA4, 'M', u'ấ'),
+ (0x1EA5, 'V'),
+ (0x1EA6, 'M', u'ầ'),
+ (0x1EA7, 'V'),
+ (0x1EA8, 'M', u'ẩ'),
+ (0x1EA9, 'V'),
+ (0x1EAA, 'M', u'ẫ'),
+ (0x1EAB, 'V'),
+ (0x1EAC, 'M', u'ậ'),
+ (0x1EAD, 'V'),
+ (0x1EAE, 'M', u'ắ'),
+ (0x1EAF, 'V'),
+ (0x1EB0, 'M', u'ằ'),
+ (0x1EB1, 'V'),
+ (0x1EB2, 'M', u'ẳ'),
+ (0x1EB3, 'V'),
+ (0x1EB4, 'M', u'ẵ'),
+ (0x1EB5, 'V'),
+ (0x1EB6, 'M', u'ặ'),
+ (0x1EB7, 'V'),
+ (0x1EB8, 'M', u'ẹ'),
+ (0x1EB9, 'V'),
+ (0x1EBA, 'M', u'ẻ'),
+ (0x1EBB, 'V'),
+ (0x1EBC, 'M', u'ẽ'),
+ (0x1EBD, 'V'),
+ (0x1EBE, 'M', u'ế'),
+ (0x1EBF, 'V'),
+ (0x1EC0, 'M', u'ề'),
+ (0x1EC1, 'V'),
+ (0x1EC2, 'M', u'ể'),
+ (0x1EC3, 'V'),
+ (0x1EC4, 'M', u'ễ'),
+ (0x1EC5, 'V'),
+ (0x1EC6, 'M', u'ệ'),
+ (0x1EC7, 'V'),
+ (0x1EC8, 'M', u'ỉ'),
+ (0x1EC9, 'V'),
+ (0x1ECA, 'M', u'ị'),
+ (0x1ECB, 'V'),
+ (0x1ECC, 'M', u'ọ'),
+ (0x1ECD, 'V'),
+ (0x1ECE, 'M', u'ỏ'),
+ (0x1ECF, 'V'),
+ (0x1ED0, 'M', u'ố'),
+ (0x1ED1, 'V'),
+ (0x1ED2, 'M', u'ồ'),
+ (0x1ED3, 'V'),
+ (0x1ED4, 'M', u'ổ'),
+ (0x1ED5, 'V'),
+ (0x1ED6, 'M', u'ỗ'),
+ (0x1ED7, 'V'),
+ (0x1ED8, 'M', u'ộ'),
+ (0x1ED9, 'V'),
+ (0x1EDA, 'M', u'ớ'),
+ (0x1EDB, 'V'),
+ (0x1EDC, 'M', u'ờ'),
+ (0x1EDD, 'V'),
+ (0x1EDE, 'M', u'ở'),
+ (0x1EDF, 'V'),
+ (0x1EE0, 'M', u'ỡ'),
+ (0x1EE1, 'V'),
+ (0x1EE2, 'M', u'ợ'),
+ (0x1EE3, 'V'),
+ (0x1EE4, 'M', u'ụ'),
+ (0x1EE5, 'V'),
+ (0x1EE6, 'M', u'ủ'),
+ (0x1EE7, 'V'),
+ (0x1EE8, 'M', u'ứ'),
+ (0x1EE9, 'V'),
+ (0x1EEA, 'M', u'ừ'),
+ (0x1EEB, 'V'),
+ (0x1EEC, 'M', u'ử'),
+ (0x1EED, 'V'),
+ ]
+
+def _seg_19():
+ return [
+ (0x1EEE, 'M', u'ữ'),
+ (0x1EEF, 'V'),
+ (0x1EF0, 'M', u'ự'),
+ (0x1EF1, 'V'),
+ (0x1EF2, 'M', u'ỳ'),
+ (0x1EF3, 'V'),
+ (0x1EF4, 'M', u'ỵ'),
+ (0x1EF5, 'V'),
+ (0x1EF6, 'M', u'ỷ'),
+ (0x1EF7, 'V'),
+ (0x1EF8, 'M', u'ỹ'),
+ (0x1EF9, 'V'),
+ (0x1EFA, 'M', u'ỻ'),
+ (0x1EFB, 'V'),
+ (0x1EFC, 'M', u'ỽ'),
+ (0x1EFD, 'V'),
+ (0x1EFE, 'M', u'ỿ'),
+ (0x1EFF, 'V'),
+ (0x1F08, 'M', u'ἀ'),
+ (0x1F09, 'M', u'ἁ'),
+ (0x1F0A, 'M', u'ἂ'),
+ (0x1F0B, 'M', u'ἃ'),
+ (0x1F0C, 'M', u'ἄ'),
+ (0x1F0D, 'M', u'ἅ'),
+ (0x1F0E, 'M', u'ἆ'),
+ (0x1F0F, 'M', u'ἇ'),
+ (0x1F10, 'V'),
+ (0x1F16, 'X'),
+ (0x1F18, 'M', u'ἐ'),
+ (0x1F19, 'M', u'ἑ'),
+ (0x1F1A, 'M', u'ἒ'),
+ (0x1F1B, 'M', u'ἓ'),
+ (0x1F1C, 'M', u'ἔ'),
+ (0x1F1D, 'M', u'ἕ'),
+ (0x1F1E, 'X'),
+ (0x1F20, 'V'),
+ (0x1F28, 'M', u'ἠ'),
+ (0x1F29, 'M', u'ἡ'),
+ (0x1F2A, 'M', u'ἢ'),
+ (0x1F2B, 'M', u'ἣ'),
+ (0x1F2C, 'M', u'ἤ'),
+ (0x1F2D, 'M', u'ἥ'),
+ (0x1F2E, 'M', u'ἦ'),
+ (0x1F2F, 'M', u'ἧ'),
+ (0x1F30, 'V'),
+ (0x1F38, 'M', u'ἰ'),
+ (0x1F39, 'M', u'ἱ'),
+ (0x1F3A, 'M', u'ἲ'),
+ (0x1F3B, 'M', u'ἳ'),
+ (0x1F3C, 'M', u'ἴ'),
+ (0x1F3D, 'M', u'ἵ'),
+ (0x1F3E, 'M', u'ἶ'),
+ (0x1F3F, 'M', u'ἷ'),
+ (0x1F40, 'V'),
+ (0x1F46, 'X'),
+ (0x1F48, 'M', u'ὀ'),
+ (0x1F49, 'M', u'ὁ'),
+ (0x1F4A, 'M', u'ὂ'),
+ (0x1F4B, 'M', u'ὃ'),
+ (0x1F4C, 'M', u'ὄ'),
+ (0x1F4D, 'M', u'ὅ'),
+ (0x1F4E, 'X'),
+ (0x1F50, 'V'),
+ (0x1F58, 'X'),
+ (0x1F59, 'M', u'ὑ'),
+ (0x1F5A, 'X'),
+ (0x1F5B, 'M', u'ὓ'),
+ (0x1F5C, 'X'),
+ (0x1F5D, 'M', u'ὕ'),
+ (0x1F5E, 'X'),
+ (0x1F5F, 'M', u'ὗ'),
+ (0x1F60, 'V'),
+ (0x1F68, 'M', u'ὠ'),
+ (0x1F69, 'M', u'ὡ'),
+ (0x1F6A, 'M', u'ὢ'),
+ (0x1F6B, 'M', u'ὣ'),
+ (0x1F6C, 'M', u'ὤ'),
+ (0x1F6D, 'M', u'ὥ'),
+ (0x1F6E, 'M', u'ὦ'),
+ (0x1F6F, 'M', u'ὧ'),
+ (0x1F70, 'V'),
+ (0x1F71, 'M', u'ά'),
+ (0x1F72, 'V'),
+ (0x1F73, 'M', u'έ'),
+ (0x1F74, 'V'),
+ (0x1F75, 'M', u'ή'),
+ (0x1F76, 'V'),
+ (0x1F77, 'M', u'ί'),
+ (0x1F78, 'V'),
+ (0x1F79, 'M', u'ό'),
+ (0x1F7A, 'V'),
+ (0x1F7B, 'M', u'ύ'),
+ (0x1F7C, 'V'),
+ (0x1F7D, 'M', u'ώ'),
+ (0x1F7E, 'X'),
+ (0x1F80, 'M', u'ἀι'),
+ (0x1F81, 'M', u'ἁι'),
+ (0x1F82, 'M', u'ἂι'),
+ (0x1F83, 'M', u'ἃι'),
+ (0x1F84, 'M', u'ἄι'),
+ ]
+
+def _seg_20():
+ return [
+ (0x1F85, 'M', u'ἅι'),
+ (0x1F86, 'M', u'ἆι'),
+ (0x1F87, 'M', u'ἇι'),
+ (0x1F88, 'M', u'ἀι'),
+ (0x1F89, 'M', u'ἁι'),
+ (0x1F8A, 'M', u'ἂι'),
+ (0x1F8B, 'M', u'ἃι'),
+ (0x1F8C, 'M', u'ἄι'),
+ (0x1F8D, 'M', u'ἅι'),
+ (0x1F8E, 'M', u'ἆι'),
+ (0x1F8F, 'M', u'ἇι'),
+ (0x1F90, 'M', u'ἠι'),
+ (0x1F91, 'M', u'ἡι'),
+ (0x1F92, 'M', u'ἢι'),
+ (0x1F93, 'M', u'ἣι'),
+ (0x1F94, 'M', u'ἤι'),
+ (0x1F95, 'M', u'ἥι'),
+ (0x1F96, 'M', u'ἦι'),
+ (0x1F97, 'M', u'ἧι'),
+ (0x1F98, 'M', u'ἠι'),
+ (0x1F99, 'M', u'ἡι'),
+ (0x1F9A, 'M', u'ἢι'),
+ (0x1F9B, 'M', u'ἣι'),
+ (0x1F9C, 'M', u'ἤι'),
+ (0x1F9D, 'M', u'ἥι'),
+ (0x1F9E, 'M', u'ἦι'),
+ (0x1F9F, 'M', u'ἧι'),
+ (0x1FA0, 'M', u'ὠι'),
+ (0x1FA1, 'M', u'ὡι'),
+ (0x1FA2, 'M', u'ὢι'),
+ (0x1FA3, 'M', u'ὣι'),
+ (0x1FA4, 'M', u'ὤι'),
+ (0x1FA5, 'M', u'ὥι'),
+ (0x1FA6, 'M', u'ὦι'),
+ (0x1FA7, 'M', u'ὧι'),
+ (0x1FA8, 'M', u'ὠι'),
+ (0x1FA9, 'M', u'ὡι'),
+ (0x1FAA, 'M', u'ὢι'),
+ (0x1FAB, 'M', u'ὣι'),
+ (0x1FAC, 'M', u'ὤι'),
+ (0x1FAD, 'M', u'ὥι'),
+ (0x1FAE, 'M', u'ὦι'),
+ (0x1FAF, 'M', u'ὧι'),
+ (0x1FB0, 'V'),
+ (0x1FB2, 'M', u'ὰι'),
+ (0x1FB3, 'M', u'αι'),
+ (0x1FB4, 'M', u'άι'),
+ (0x1FB5, 'X'),
+ (0x1FB6, 'V'),
+ (0x1FB7, 'M', u'ᾶι'),
+ (0x1FB8, 'M', u'ᾰ'),
+ (0x1FB9, 'M', u'ᾱ'),
+ (0x1FBA, 'M', u'ὰ'),
+ (0x1FBB, 'M', u'ά'),
+ (0x1FBC, 'M', u'αι'),
+ (0x1FBD, '3', u' ̓'),
+ (0x1FBE, 'M', u'ι'),
+ (0x1FBF, '3', u' ̓'),
+ (0x1FC0, '3', u' ͂'),
+ (0x1FC1, '3', u' ̈͂'),
+ (0x1FC2, 'M', u'ὴι'),
+ (0x1FC3, 'M', u'ηι'),
+ (0x1FC4, 'M', u'ήι'),
+ (0x1FC5, 'X'),
+ (0x1FC6, 'V'),
+ (0x1FC7, 'M', u'ῆι'),
+ (0x1FC8, 'M', u'ὲ'),
+ (0x1FC9, 'M', u'έ'),
+ (0x1FCA, 'M', u'ὴ'),
+ (0x1FCB, 'M', u'ή'),
+ (0x1FCC, 'M', u'ηι'),
+ (0x1FCD, '3', u' ̓̀'),
+ (0x1FCE, '3', u' ̓́'),
+ (0x1FCF, '3', u' ̓͂'),
+ (0x1FD0, 'V'),
+ (0x1FD3, 'M', u'ΐ'),
+ (0x1FD4, 'X'),
+ (0x1FD6, 'V'),
+ (0x1FD8, 'M', u'ῐ'),
+ (0x1FD9, 'M', u'ῑ'),
+ (0x1FDA, 'M', u'ὶ'),
+ (0x1FDB, 'M', u'ί'),
+ (0x1FDC, 'X'),
+ (0x1FDD, '3', u' ̔̀'),
+ (0x1FDE, '3', u' ̔́'),
+ (0x1FDF, '3', u' ̔͂'),
+ (0x1FE0, 'V'),
+ (0x1FE3, 'M', u'ΰ'),
+ (0x1FE4, 'V'),
+ (0x1FE8, 'M', u'ῠ'),
+ (0x1FE9, 'M', u'ῡ'),
+ (0x1FEA, 'M', u'ὺ'),
+ (0x1FEB, 'M', u'ύ'),
+ (0x1FEC, 'M', u'ῥ'),
+ (0x1FED, '3', u' ̈̀'),
+ (0x1FEE, '3', u' ̈́'),
+ (0x1FEF, '3', u'`'),
+ (0x1FF0, 'X'),
+ (0x1FF2, 'M', u'ὼι'),
+ (0x1FF3, 'M', u'ωι'),
+ ]
+
+def _seg_21():
+ return [
+ (0x1FF4, 'M', u'ώι'),
+ (0x1FF5, 'X'),
+ (0x1FF6, 'V'),
+ (0x1FF7, 'M', u'ῶι'),
+ (0x1FF8, 'M', u'ὸ'),
+ (0x1FF9, 'M', u'ό'),
+ (0x1FFA, 'M', u'ὼ'),
+ (0x1FFB, 'M', u'ώ'),
+ (0x1FFC, 'M', u'ωι'),
+ (0x1FFD, '3', u' ́'),
+ (0x1FFE, '3', u' ̔'),
+ (0x1FFF, 'X'),
+ (0x2000, '3', u' '),
+ (0x200B, 'I'),
+ (0x200C, 'D', u''),
+ (0x200E, 'X'),
+ (0x2010, 'V'),
+ (0x2011, 'M', u'‐'),
+ (0x2012, 'V'),
+ (0x2017, '3', u' ̳'),
+ (0x2018, 'V'),
+ (0x2024, 'X'),
+ (0x2027, 'V'),
+ (0x2028, 'X'),
+ (0x202F, '3', u' '),
+ (0x2030, 'V'),
+ (0x2033, 'M', u'′′'),
+ (0x2034, 'M', u'′′′'),
+ (0x2035, 'V'),
+ (0x2036, 'M', u'‵‵'),
+ (0x2037, 'M', u'‵‵‵'),
+ (0x2038, 'V'),
+ (0x203C, '3', u'!!'),
+ (0x203D, 'V'),
+ (0x203E, '3', u' ̅'),
+ (0x203F, 'V'),
+ (0x2047, '3', u'??'),
+ (0x2048, '3', u'?!'),
+ (0x2049, '3', u'!?'),
+ (0x204A, 'V'),
+ (0x2057, 'M', u'′′′′'),
+ (0x2058, 'V'),
+ (0x205F, '3', u' '),
+ (0x2060, 'I'),
+ (0x2061, 'X'),
+ (0x2064, 'I'),
+ (0x2065, 'X'),
+ (0x2070, 'M', u'0'),
+ (0x2071, 'M', u'i'),
+ (0x2072, 'X'),
+ (0x2074, 'M', u'4'),
+ (0x2075, 'M', u'5'),
+ (0x2076, 'M', u'6'),
+ (0x2077, 'M', u'7'),
+ (0x2078, 'M', u'8'),
+ (0x2079, 'M', u'9'),
+ (0x207A, '3', u'+'),
+ (0x207B, 'M', u'−'),
+ (0x207C, '3', u'='),
+ (0x207D, '3', u'('),
+ (0x207E, '3', u')'),
+ (0x207F, 'M', u'n'),
+ (0x2080, 'M', u'0'),
+ (0x2081, 'M', u'1'),
+ (0x2082, 'M', u'2'),
+ (0x2083, 'M', u'3'),
+ (0x2084, 'M', u'4'),
+ (0x2085, 'M', u'5'),
+ (0x2086, 'M', u'6'),
+ (0x2087, 'M', u'7'),
+ (0x2088, 'M', u'8'),
+ (0x2089, 'M', u'9'),
+ (0x208A, '3', u'+'),
+ (0x208B, 'M', u'−'),
+ (0x208C, '3', u'='),
+ (0x208D, '3', u'('),
+ (0x208E, '3', u')'),
+ (0x208F, 'X'),
+ (0x2090, 'M', u'a'),
+ (0x2091, 'M', u'e'),
+ (0x2092, 'M', u'o'),
+ (0x2093, 'M', u'x'),
+ (0x2094, 'M', u'ə'),
+ (0x2095, 'M', u'h'),
+ (0x2096, 'M', u'k'),
+ (0x2097, 'M', u'l'),
+ (0x2098, 'M', u'm'),
+ (0x2099, 'M', u'n'),
+ (0x209A, 'M', u'p'),
+ (0x209B, 'M', u's'),
+ (0x209C, 'M', u't'),
+ (0x209D, 'X'),
+ (0x20A0, 'V'),
+ (0x20A8, 'M', u'rs'),
+ (0x20A9, 'V'),
+ (0x20C0, 'X'),
+ (0x20D0, 'V'),
+ (0x20F1, 'X'),
+ (0x2100, '3', u'a/c'),
+ (0x2101, '3', u'a/s'),
+ ]
+
+def _seg_22():
+ return [
+ (0x2102, 'M', u'c'),
+ (0x2103, 'M', u'°c'),
+ (0x2104, 'V'),
+ (0x2105, '3', u'c/o'),
+ (0x2106, '3', u'c/u'),
+ (0x2107, 'M', u'ɛ'),
+ (0x2108, 'V'),
+ (0x2109, 'M', u'°f'),
+ (0x210A, 'M', u'g'),
+ (0x210B, 'M', u'h'),
+ (0x210F, 'M', u'ħ'),
+ (0x2110, 'M', u'i'),
+ (0x2112, 'M', u'l'),
+ (0x2114, 'V'),
+ (0x2115, 'M', u'n'),
+ (0x2116, 'M', u'no'),
+ (0x2117, 'V'),
+ (0x2119, 'M', u'p'),
+ (0x211A, 'M', u'q'),
+ (0x211B, 'M', u'r'),
+ (0x211E, 'V'),
+ (0x2120, 'M', u'sm'),
+ (0x2121, 'M', u'tel'),
+ (0x2122, 'M', u'tm'),
+ (0x2123, 'V'),
+ (0x2124, 'M', u'z'),
+ (0x2125, 'V'),
+ (0x2126, 'M', u'ω'),
+ (0x2127, 'V'),
+ (0x2128, 'M', u'z'),
+ (0x2129, 'V'),
+ (0x212A, 'M', u'k'),
+ (0x212B, 'M', u'å'),
+ (0x212C, 'M', u'b'),
+ (0x212D, 'M', u'c'),
+ (0x212E, 'V'),
+ (0x212F, 'M', u'e'),
+ (0x2131, 'M', u'f'),
+ (0x2132, 'X'),
+ (0x2133, 'M', u'm'),
+ (0x2134, 'M', u'o'),
+ (0x2135, 'M', u'א'),
+ (0x2136, 'M', u'ב'),
+ (0x2137, 'M', u'ג'),
+ (0x2138, 'M', u'ד'),
+ (0x2139, 'M', u'i'),
+ (0x213A, 'V'),
+ (0x213B, 'M', u'fax'),
+ (0x213C, 'M', u'π'),
+ (0x213D, 'M', u'γ'),
+ (0x213F, 'M', u'π'),
+ (0x2140, 'M', u'∑'),
+ (0x2141, 'V'),
+ (0x2145, 'M', u'd'),
+ (0x2147, 'M', u'e'),
+ (0x2148, 'M', u'i'),
+ (0x2149, 'M', u'j'),
+ (0x214A, 'V'),
+ (0x2150, 'M', u'1⁄7'),
+ (0x2151, 'M', u'1⁄9'),
+ (0x2152, 'M', u'1⁄10'),
+ (0x2153, 'M', u'1⁄3'),
+ (0x2154, 'M', u'2⁄3'),
+ (0x2155, 'M', u'1⁄5'),
+ (0x2156, 'M', u'2⁄5'),
+ (0x2157, 'M', u'3⁄5'),
+ (0x2158, 'M', u'4⁄5'),
+ (0x2159, 'M', u'1⁄6'),
+ (0x215A, 'M', u'5⁄6'),
+ (0x215B, 'M', u'1⁄8'),
+ (0x215C, 'M', u'3⁄8'),
+ (0x215D, 'M', u'5⁄8'),
+ (0x215E, 'M', u'7⁄8'),
+ (0x215F, 'M', u'1⁄'),
+ (0x2160, 'M', u'i'),
+ (0x2161, 'M', u'ii'),
+ (0x2162, 'M', u'iii'),
+ (0x2163, 'M', u'iv'),
+ (0x2164, 'M', u'v'),
+ (0x2165, 'M', u'vi'),
+ (0x2166, 'M', u'vii'),
+ (0x2167, 'M', u'viii'),
+ (0x2168, 'M', u'ix'),
+ (0x2169, 'M', u'x'),
+ (0x216A, 'M', u'xi'),
+ (0x216B, 'M', u'xii'),
+ (0x216C, 'M', u'l'),
+ (0x216D, 'M', u'c'),
+ (0x216E, 'M', u'd'),
+ (0x216F, 'M', u'm'),
+ (0x2170, 'M', u'i'),
+ (0x2171, 'M', u'ii'),
+ (0x2172, 'M', u'iii'),
+ (0x2173, 'M', u'iv'),
+ (0x2174, 'M', u'v'),
+ (0x2175, 'M', u'vi'),
+ (0x2176, 'M', u'vii'),
+ (0x2177, 'M', u'viii'),
+ (0x2178, 'M', u'ix'),
+ (0x2179, 'M', u'x'),
+ ]
+
+def _seg_23():
+ return [
+ (0x217A, 'M', u'xi'),
+ (0x217B, 'M', u'xii'),
+ (0x217C, 'M', u'l'),
+ (0x217D, 'M', u'c'),
+ (0x217E, 'M', u'd'),
+ (0x217F, 'M', u'm'),
+ (0x2180, 'V'),
+ (0x2183, 'X'),
+ (0x2184, 'V'),
+ (0x2189, 'M', u'0⁄3'),
+ (0x218A, 'V'),
+ (0x218C, 'X'),
+ (0x2190, 'V'),
+ (0x222C, 'M', u'∫∫'),
+ (0x222D, 'M', u'∫∫∫'),
+ (0x222E, 'V'),
+ (0x222F, 'M', u'∮∮'),
+ (0x2230, 'M', u'∮∮∮'),
+ (0x2231, 'V'),
+ (0x2260, '3'),
+ (0x2261, 'V'),
+ (0x226E, '3'),
+ (0x2270, 'V'),
+ (0x2329, 'M', u'〈'),
+ (0x232A, 'M', u'〉'),
+ (0x232B, 'V'),
+ (0x2427, 'X'),
+ (0x2440, 'V'),
+ (0x244B, 'X'),
+ (0x2460, 'M', u'1'),
+ (0x2461, 'M', u'2'),
+ (0x2462, 'M', u'3'),
+ (0x2463, 'M', u'4'),
+ (0x2464, 'M', u'5'),
+ (0x2465, 'M', u'6'),
+ (0x2466, 'M', u'7'),
+ (0x2467, 'M', u'8'),
+ (0x2468, 'M', u'9'),
+ (0x2469, 'M', u'10'),
+ (0x246A, 'M', u'11'),
+ (0x246B, 'M', u'12'),
+ (0x246C, 'M', u'13'),
+ (0x246D, 'M', u'14'),
+ (0x246E, 'M', u'15'),
+ (0x246F, 'M', u'16'),
+ (0x2470, 'M', u'17'),
+ (0x2471, 'M', u'18'),
+ (0x2472, 'M', u'19'),
+ (0x2473, 'M', u'20'),
+ (0x2474, '3', u'(1)'),
+ (0x2475, '3', u'(2)'),
+ (0x2476, '3', u'(3)'),
+ (0x2477, '3', u'(4)'),
+ (0x2478, '3', u'(5)'),
+ (0x2479, '3', u'(6)'),
+ (0x247A, '3', u'(7)'),
+ (0x247B, '3', u'(8)'),
+ (0x247C, '3', u'(9)'),
+ (0x247D, '3', u'(10)'),
+ (0x247E, '3', u'(11)'),
+ (0x247F, '3', u'(12)'),
+ (0x2480, '3', u'(13)'),
+ (0x2481, '3', u'(14)'),
+ (0x2482, '3', u'(15)'),
+ (0x2483, '3', u'(16)'),
+ (0x2484, '3', u'(17)'),
+ (0x2485, '3', u'(18)'),
+ (0x2486, '3', u'(19)'),
+ (0x2487, '3', u'(20)'),
+ (0x2488, 'X'),
+ (0x249C, '3', u'(a)'),
+ (0x249D, '3', u'(b)'),
+ (0x249E, '3', u'(c)'),
+ (0x249F, '3', u'(d)'),
+ (0x24A0, '3', u'(e)'),
+ (0x24A1, '3', u'(f)'),
+ (0x24A2, '3', u'(g)'),
+ (0x24A3, '3', u'(h)'),
+ (0x24A4, '3', u'(i)'),
+ (0x24A5, '3', u'(j)'),
+ (0x24A6, '3', u'(k)'),
+ (0x24A7, '3', u'(l)'),
+ (0x24A8, '3', u'(m)'),
+ (0x24A9, '3', u'(n)'),
+ (0x24AA, '3', u'(o)'),
+ (0x24AB, '3', u'(p)'),
+ (0x24AC, '3', u'(q)'),
+ (0x24AD, '3', u'(r)'),
+ (0x24AE, '3', u'(s)'),
+ (0x24AF, '3', u'(t)'),
+ (0x24B0, '3', u'(u)'),
+ (0x24B1, '3', u'(v)'),
+ (0x24B2, '3', u'(w)'),
+ (0x24B3, '3', u'(x)'),
+ (0x24B4, '3', u'(y)'),
+ (0x24B5, '3', u'(z)'),
+ (0x24B6, 'M', u'a'),
+ (0x24B7, 'M', u'b'),
+ (0x24B8, 'M', u'c'),
+ (0x24B9, 'M', u'd'),
+ ]
+
+def _seg_24():
+ return [
+ (0x24BA, 'M', u'e'),
+ (0x24BB, 'M', u'f'),
+ (0x24BC, 'M', u'g'),
+ (0x24BD, 'M', u'h'),
+ (0x24BE, 'M', u'i'),
+ (0x24BF, 'M', u'j'),
+ (0x24C0, 'M', u'k'),
+ (0x24C1, 'M', u'l'),
+ (0x24C2, 'M', u'm'),
+ (0x24C3, 'M', u'n'),
+ (0x24C4, 'M', u'o'),
+ (0x24C5, 'M', u'p'),
+ (0x24C6, 'M', u'q'),
+ (0x24C7, 'M', u'r'),
+ (0x24C8, 'M', u's'),
+ (0x24C9, 'M', u't'),
+ (0x24CA, 'M', u'u'),
+ (0x24CB, 'M', u'v'),
+ (0x24CC, 'M', u'w'),
+ (0x24CD, 'M', u'x'),
+ (0x24CE, 'M', u'y'),
+ (0x24CF, 'M', u'z'),
+ (0x24D0, 'M', u'a'),
+ (0x24D1, 'M', u'b'),
+ (0x24D2, 'M', u'c'),
+ (0x24D3, 'M', u'd'),
+ (0x24D4, 'M', u'e'),
+ (0x24D5, 'M', u'f'),
+ (0x24D6, 'M', u'g'),
+ (0x24D7, 'M', u'h'),
+ (0x24D8, 'M', u'i'),
+ (0x24D9, 'M', u'j'),
+ (0x24DA, 'M', u'k'),
+ (0x24DB, 'M', u'l'),
+ (0x24DC, 'M', u'm'),
+ (0x24DD, 'M', u'n'),
+ (0x24DE, 'M', u'o'),
+ (0x24DF, 'M', u'p'),
+ (0x24E0, 'M', u'q'),
+ (0x24E1, 'M', u'r'),
+ (0x24E2, 'M', u's'),
+ (0x24E3, 'M', u't'),
+ (0x24E4, 'M', u'u'),
+ (0x24E5, 'M', u'v'),
+ (0x24E6, 'M', u'w'),
+ (0x24E7, 'M', u'x'),
+ (0x24E8, 'M', u'y'),
+ (0x24E9, 'M', u'z'),
+ (0x24EA, 'M', u'0'),
+ (0x24EB, 'V'),
+ (0x2A0C, 'M', u'∫∫∫∫'),
+ (0x2A0D, 'V'),
+ (0x2A74, '3', u'::='),
+ (0x2A75, '3', u'=='),
+ (0x2A76, '3', u'==='),
+ (0x2A77, 'V'),
+ (0x2ADC, 'M', u'⫝̸'),
+ (0x2ADD, 'V'),
+ (0x2B74, 'X'),
+ (0x2B76, 'V'),
+ (0x2B96, 'X'),
+ (0x2B97, 'V'),
+ (0x2C00, 'M', u'ⰰ'),
+ (0x2C01, 'M', u'ⰱ'),
+ (0x2C02, 'M', u'ⰲ'),
+ (0x2C03, 'M', u'ⰳ'),
+ (0x2C04, 'M', u'ⰴ'),
+ (0x2C05, 'M', u'ⰵ'),
+ (0x2C06, 'M', u'ⰶ'),
+ (0x2C07, 'M', u'ⰷ'),
+ (0x2C08, 'M', u'ⰸ'),
+ (0x2C09, 'M', u'ⰹ'),
+ (0x2C0A, 'M', u'ⰺ'),
+ (0x2C0B, 'M', u'ⰻ'),
+ (0x2C0C, 'M', u'ⰼ'),
+ (0x2C0D, 'M', u'ⰽ'),
+ (0x2C0E, 'M', u'ⰾ'),
+ (0x2C0F, 'M', u'ⰿ'),
+ (0x2C10, 'M', u'ⱀ'),
+ (0x2C11, 'M', u'ⱁ'),
+ (0x2C12, 'M', u'ⱂ'),
+ (0x2C13, 'M', u'ⱃ'),
+ (0x2C14, 'M', u'ⱄ'),
+ (0x2C15, 'M', u'ⱅ'),
+ (0x2C16, 'M', u'ⱆ'),
+ (0x2C17, 'M', u'ⱇ'),
+ (0x2C18, 'M', u'ⱈ'),
+ (0x2C19, 'M', u'ⱉ'),
+ (0x2C1A, 'M', u'ⱊ'),
+ (0x2C1B, 'M', u'ⱋ'),
+ (0x2C1C, 'M', u'ⱌ'),
+ (0x2C1D, 'M', u'ⱍ'),
+ (0x2C1E, 'M', u'ⱎ'),
+ (0x2C1F, 'M', u'ⱏ'),
+ (0x2C20, 'M', u'ⱐ'),
+ (0x2C21, 'M', u'ⱑ'),
+ (0x2C22, 'M', u'ⱒ'),
+ (0x2C23, 'M', u'ⱓ'),
+ (0x2C24, 'M', u'ⱔ'),
+ (0x2C25, 'M', u'ⱕ'),
+ ]
+
+def _seg_25():
+ return [
+ (0x2C26, 'M', u'ⱖ'),
+ (0x2C27, 'M', u'ⱗ'),
+ (0x2C28, 'M', u'ⱘ'),
+ (0x2C29, 'M', u'ⱙ'),
+ (0x2C2A, 'M', u'ⱚ'),
+ (0x2C2B, 'M', u'ⱛ'),
+ (0x2C2C, 'M', u'ⱜ'),
+ (0x2C2D, 'M', u'ⱝ'),
+ (0x2C2E, 'M', u'ⱞ'),
+ (0x2C2F, 'X'),
+ (0x2C30, 'V'),
+ (0x2C5F, 'X'),
+ (0x2C60, 'M', u'ⱡ'),
+ (0x2C61, 'V'),
+ (0x2C62, 'M', u'ɫ'),
+ (0x2C63, 'M', u'ᵽ'),
+ (0x2C64, 'M', u'ɽ'),
+ (0x2C65, 'V'),
+ (0x2C67, 'M', u'ⱨ'),
+ (0x2C68, 'V'),
+ (0x2C69, 'M', u'ⱪ'),
+ (0x2C6A, 'V'),
+ (0x2C6B, 'M', u'ⱬ'),
+ (0x2C6C, 'V'),
+ (0x2C6D, 'M', u'ɑ'),
+ (0x2C6E, 'M', u'ɱ'),
+ (0x2C6F, 'M', u'ɐ'),
+ (0x2C70, 'M', u'ɒ'),
+ (0x2C71, 'V'),
+ (0x2C72, 'M', u'ⱳ'),
+ (0x2C73, 'V'),
+ (0x2C75, 'M', u'ⱶ'),
+ (0x2C76, 'V'),
+ (0x2C7C, 'M', u'j'),
+ (0x2C7D, 'M', u'v'),
+ (0x2C7E, 'M', u'ȿ'),
+ (0x2C7F, 'M', u'ɀ'),
+ (0x2C80, 'M', u'ⲁ'),
+ (0x2C81, 'V'),
+ (0x2C82, 'M', u'ⲃ'),
+ (0x2C83, 'V'),
+ (0x2C84, 'M', u'ⲅ'),
+ (0x2C85, 'V'),
+ (0x2C86, 'M', u'ⲇ'),
+ (0x2C87, 'V'),
+ (0x2C88, 'M', u'ⲉ'),
+ (0x2C89, 'V'),
+ (0x2C8A, 'M', u'ⲋ'),
+ (0x2C8B, 'V'),
+ (0x2C8C, 'M', u'ⲍ'),
+ (0x2C8D, 'V'),
+ (0x2C8E, 'M', u'ⲏ'),
+ (0x2C8F, 'V'),
+ (0x2C90, 'M', u'ⲑ'),
+ (0x2C91, 'V'),
+ (0x2C92, 'M', u'ⲓ'),
+ (0x2C93, 'V'),
+ (0x2C94, 'M', u'ⲕ'),
+ (0x2C95, 'V'),
+ (0x2C96, 'M', u'ⲗ'),
+ (0x2C97, 'V'),
+ (0x2C98, 'M', u'ⲙ'),
+ (0x2C99, 'V'),
+ (0x2C9A, 'M', u'ⲛ'),
+ (0x2C9B, 'V'),
+ (0x2C9C, 'M', u'ⲝ'),
+ (0x2C9D, 'V'),
+ (0x2C9E, 'M', u'ⲟ'),
+ (0x2C9F, 'V'),
+ (0x2CA0, 'M', u'ⲡ'),
+ (0x2CA1, 'V'),
+ (0x2CA2, 'M', u'ⲣ'),
+ (0x2CA3, 'V'),
+ (0x2CA4, 'M', u'ⲥ'),
+ (0x2CA5, 'V'),
+ (0x2CA6, 'M', u'ⲧ'),
+ (0x2CA7, 'V'),
+ (0x2CA8, 'M', u'ⲩ'),
+ (0x2CA9, 'V'),
+ (0x2CAA, 'M', u'ⲫ'),
+ (0x2CAB, 'V'),
+ (0x2CAC, 'M', u'ⲭ'),
+ (0x2CAD, 'V'),
+ (0x2CAE, 'M', u'ⲯ'),
+ (0x2CAF, 'V'),
+ (0x2CB0, 'M', u'ⲱ'),
+ (0x2CB1, 'V'),
+ (0x2CB2, 'M', u'ⲳ'),
+ (0x2CB3, 'V'),
+ (0x2CB4, 'M', u'ⲵ'),
+ (0x2CB5, 'V'),
+ (0x2CB6, 'M', u'ⲷ'),
+ (0x2CB7, 'V'),
+ (0x2CB8, 'M', u'ⲹ'),
+ (0x2CB9, 'V'),
+ (0x2CBA, 'M', u'ⲻ'),
+ (0x2CBB, 'V'),
+ (0x2CBC, 'M', u'ⲽ'),
+ (0x2CBD, 'V'),
+ (0x2CBE, 'M', u'ⲿ'),
+ ]
+
+def _seg_26():
+ return [
+ (0x2CBF, 'V'),
+ (0x2CC0, 'M', u'ⳁ'),
+ (0x2CC1, 'V'),
+ (0x2CC2, 'M', u'ⳃ'),
+ (0x2CC3, 'V'),
+ (0x2CC4, 'M', u'ⳅ'),
+ (0x2CC5, 'V'),
+ (0x2CC6, 'M', u'ⳇ'),
+ (0x2CC7, 'V'),
+ (0x2CC8, 'M', u'ⳉ'),
+ (0x2CC9, 'V'),
+ (0x2CCA, 'M', u'ⳋ'),
+ (0x2CCB, 'V'),
+ (0x2CCC, 'M', u'ⳍ'),
+ (0x2CCD, 'V'),
+ (0x2CCE, 'M', u'ⳏ'),
+ (0x2CCF, 'V'),
+ (0x2CD0, 'M', u'ⳑ'),
+ (0x2CD1, 'V'),
+ (0x2CD2, 'M', u'ⳓ'),
+ (0x2CD3, 'V'),
+ (0x2CD4, 'M', u'ⳕ'),
+ (0x2CD5, 'V'),
+ (0x2CD6, 'M', u'ⳗ'),
+ (0x2CD7, 'V'),
+ (0x2CD8, 'M', u'ⳙ'),
+ (0x2CD9, 'V'),
+ (0x2CDA, 'M', u'ⳛ'),
+ (0x2CDB, 'V'),
+ (0x2CDC, 'M', u'ⳝ'),
+ (0x2CDD, 'V'),
+ (0x2CDE, 'M', u'ⳟ'),
+ (0x2CDF, 'V'),
+ (0x2CE0, 'M', u'ⳡ'),
+ (0x2CE1, 'V'),
+ (0x2CE2, 'M', u'ⳣ'),
+ (0x2CE3, 'V'),
+ (0x2CEB, 'M', u'ⳬ'),
+ (0x2CEC, 'V'),
+ (0x2CED, 'M', u'ⳮ'),
+ (0x2CEE, 'V'),
+ (0x2CF2, 'M', u'ⳳ'),
+ (0x2CF3, 'V'),
+ (0x2CF4, 'X'),
+ (0x2CF9, 'V'),
+ (0x2D26, 'X'),
+ (0x2D27, 'V'),
+ (0x2D28, 'X'),
+ (0x2D2D, 'V'),
+ (0x2D2E, 'X'),
+ (0x2D30, 'V'),
+ (0x2D68, 'X'),
+ (0x2D6F, 'M', u'ⵡ'),
+ (0x2D70, 'V'),
+ (0x2D71, 'X'),
+ (0x2D7F, 'V'),
+ (0x2D97, 'X'),
+ (0x2DA0, 'V'),
+ (0x2DA7, 'X'),
+ (0x2DA8, 'V'),
+ (0x2DAF, 'X'),
+ (0x2DB0, 'V'),
+ (0x2DB7, 'X'),
+ (0x2DB8, 'V'),
+ (0x2DBF, 'X'),
+ (0x2DC0, 'V'),
+ (0x2DC7, 'X'),
+ (0x2DC8, 'V'),
+ (0x2DCF, 'X'),
+ (0x2DD0, 'V'),
+ (0x2DD7, 'X'),
+ (0x2DD8, 'V'),
+ (0x2DDF, 'X'),
+ (0x2DE0, 'V'),
+ (0x2E53, 'X'),
+ (0x2E80, 'V'),
+ (0x2E9A, 'X'),
+ (0x2E9B, 'V'),
+ (0x2E9F, 'M', u'母'),
+ (0x2EA0, 'V'),
+ (0x2EF3, 'M', u'龟'),
+ (0x2EF4, 'X'),
+ (0x2F00, 'M', u'一'),
+ (0x2F01, 'M', u'丨'),
+ (0x2F02, 'M', u'丶'),
+ (0x2F03, 'M', u'丿'),
+ (0x2F04, 'M', u'乙'),
+ (0x2F05, 'M', u'亅'),
+ (0x2F06, 'M', u'二'),
+ (0x2F07, 'M', u'亠'),
+ (0x2F08, 'M', u'人'),
+ (0x2F09, 'M', u'儿'),
+ (0x2F0A, 'M', u'入'),
+ (0x2F0B, 'M', u'八'),
+ (0x2F0C, 'M', u'冂'),
+ (0x2F0D, 'M', u'冖'),
+ (0x2F0E, 'M', u'冫'),
+ (0x2F0F, 'M', u'几'),
+ (0x2F10, 'M', u'凵'),
+ (0x2F11, 'M', u'刀'),
+ ]
+
+def _seg_27():
+ return [
+ (0x2F12, 'M', u'力'),
+ (0x2F13, 'M', u'勹'),
+ (0x2F14, 'M', u'匕'),
+ (0x2F15, 'M', u'匚'),
+ (0x2F16, 'M', u'匸'),
+ (0x2F17, 'M', u'十'),
+ (0x2F18, 'M', u'卜'),
+ (0x2F19, 'M', u'卩'),
+ (0x2F1A, 'M', u'厂'),
+ (0x2F1B, 'M', u'厶'),
+ (0x2F1C, 'M', u'又'),
+ (0x2F1D, 'M', u'口'),
+ (0x2F1E, 'M', u'囗'),
+ (0x2F1F, 'M', u'土'),
+ (0x2F20, 'M', u'士'),
+ (0x2F21, 'M', u'夂'),
+ (0x2F22, 'M', u'夊'),
+ (0x2F23, 'M', u'夕'),
+ (0x2F24, 'M', u'大'),
+ (0x2F25, 'M', u'女'),
+ (0x2F26, 'M', u'子'),
+ (0x2F27, 'M', u'宀'),
+ (0x2F28, 'M', u'寸'),
+ (0x2F29, 'M', u'小'),
+ (0x2F2A, 'M', u'尢'),
+ (0x2F2B, 'M', u'尸'),
+ (0x2F2C, 'M', u'屮'),
+ (0x2F2D, 'M', u'山'),
+ (0x2F2E, 'M', u'巛'),
+ (0x2F2F, 'M', u'工'),
+ (0x2F30, 'M', u'己'),
+ (0x2F31, 'M', u'巾'),
+ (0x2F32, 'M', u'干'),
+ (0x2F33, 'M', u'幺'),
+ (0x2F34, 'M', u'广'),
+ (0x2F35, 'M', u'廴'),
+ (0x2F36, 'M', u'廾'),
+ (0x2F37, 'M', u'弋'),
+ (0x2F38, 'M', u'弓'),
+ (0x2F39, 'M', u'彐'),
+ (0x2F3A, 'M', u'彡'),
+ (0x2F3B, 'M', u'彳'),
+ (0x2F3C, 'M', u'心'),
+ (0x2F3D, 'M', u'戈'),
+ (0x2F3E, 'M', u'戶'),
+ (0x2F3F, 'M', u'手'),
+ (0x2F40, 'M', u'支'),
+ (0x2F41, 'M', u'攴'),
+ (0x2F42, 'M', u'文'),
+ (0x2F43, 'M', u'斗'),
+ (0x2F44, 'M', u'斤'),
+ (0x2F45, 'M', u'方'),
+ (0x2F46, 'M', u'无'),
+ (0x2F47, 'M', u'日'),
+ (0x2F48, 'M', u'曰'),
+ (0x2F49, 'M', u'月'),
+ (0x2F4A, 'M', u'木'),
+ (0x2F4B, 'M', u'欠'),
+ (0x2F4C, 'M', u'止'),
+ (0x2F4D, 'M', u'歹'),
+ (0x2F4E, 'M', u'殳'),
+ (0x2F4F, 'M', u'毋'),
+ (0x2F50, 'M', u'比'),
+ (0x2F51, 'M', u'毛'),
+ (0x2F52, 'M', u'氏'),
+ (0x2F53, 'M', u'气'),
+ (0x2F54, 'M', u'水'),
+ (0x2F55, 'M', u'火'),
+ (0x2F56, 'M', u'爪'),
+ (0x2F57, 'M', u'父'),
+ (0x2F58, 'M', u'爻'),
+ (0x2F59, 'M', u'爿'),
+ (0x2F5A, 'M', u'片'),
+ (0x2F5B, 'M', u'牙'),
+ (0x2F5C, 'M', u'牛'),
+ (0x2F5D, 'M', u'犬'),
+ (0x2F5E, 'M', u'玄'),
+ (0x2F5F, 'M', u'玉'),
+ (0x2F60, 'M', u'瓜'),
+ (0x2F61, 'M', u'瓦'),
+ (0x2F62, 'M', u'甘'),
+ (0x2F63, 'M', u'生'),
+ (0x2F64, 'M', u'用'),
+ (0x2F65, 'M', u'田'),
+ (0x2F66, 'M', u'疋'),
+ (0x2F67, 'M', u'疒'),
+ (0x2F68, 'M', u'癶'),
+ (0x2F69, 'M', u'白'),
+ (0x2F6A, 'M', u'皮'),
+ (0x2F6B, 'M', u'皿'),
+ (0x2F6C, 'M', u'目'),
+ (0x2F6D, 'M', u'矛'),
+ (0x2F6E, 'M', u'矢'),
+ (0x2F6F, 'M', u'石'),
+ (0x2F70, 'M', u'示'),
+ (0x2F71, 'M', u'禸'),
+ (0x2F72, 'M', u'禾'),
+ (0x2F73, 'M', u'穴'),
+ (0x2F74, 'M', u'立'),
+ (0x2F75, 'M', u'竹'),
+ ]
+
+def _seg_28():
+ return [
+ (0x2F76, 'M', u'米'),
+ (0x2F77, 'M', u'糸'),
+ (0x2F78, 'M', u'缶'),
+ (0x2F79, 'M', u'网'),
+ (0x2F7A, 'M', u'羊'),
+ (0x2F7B, 'M', u'羽'),
+ (0x2F7C, 'M', u'老'),
+ (0x2F7D, 'M', u'而'),
+ (0x2F7E, 'M', u'耒'),
+ (0x2F7F, 'M', u'耳'),
+ (0x2F80, 'M', u'聿'),
+ (0x2F81, 'M', u'肉'),
+ (0x2F82, 'M', u'臣'),
+ (0x2F83, 'M', u'自'),
+ (0x2F84, 'M', u'至'),
+ (0x2F85, 'M', u'臼'),
+ (0x2F86, 'M', u'舌'),
+ (0x2F87, 'M', u'舛'),
+ (0x2F88, 'M', u'舟'),
+ (0x2F89, 'M', u'艮'),
+ (0x2F8A, 'M', u'色'),
+ (0x2F8B, 'M', u'艸'),
+ (0x2F8C, 'M', u'虍'),
+ (0x2F8D, 'M', u'虫'),
+ (0x2F8E, 'M', u'血'),
+ (0x2F8F, 'M', u'行'),
+ (0x2F90, 'M', u'衣'),
+ (0x2F91, 'M', u'襾'),
+ (0x2F92, 'M', u'見'),
+ (0x2F93, 'M', u'角'),
+ (0x2F94, 'M', u'言'),
+ (0x2F95, 'M', u'谷'),
+ (0x2F96, 'M', u'豆'),
+ (0x2F97, 'M', u'豕'),
+ (0x2F98, 'M', u'豸'),
+ (0x2F99, 'M', u'貝'),
+ (0x2F9A, 'M', u'赤'),
+ (0x2F9B, 'M', u'走'),
+ (0x2F9C, 'M', u'足'),
+ (0x2F9D, 'M', u'身'),
+ (0x2F9E, 'M', u'車'),
+ (0x2F9F, 'M', u'辛'),
+ (0x2FA0, 'M', u'辰'),
+ (0x2FA1, 'M', u'辵'),
+ (0x2FA2, 'M', u'邑'),
+ (0x2FA3, 'M', u'酉'),
+ (0x2FA4, 'M', u'釆'),
+ (0x2FA5, 'M', u'里'),
+ (0x2FA6, 'M', u'金'),
+ (0x2FA7, 'M', u'長'),
+ (0x2FA8, 'M', u'門'),
+ (0x2FA9, 'M', u'阜'),
+ (0x2FAA, 'M', u'隶'),
+ (0x2FAB, 'M', u'隹'),
+ (0x2FAC, 'M', u'雨'),
+ (0x2FAD, 'M', u'靑'),
+ (0x2FAE, 'M', u'非'),
+ (0x2FAF, 'M', u'面'),
+ (0x2FB0, 'M', u'革'),
+ (0x2FB1, 'M', u'韋'),
+ (0x2FB2, 'M', u'韭'),
+ (0x2FB3, 'M', u'音'),
+ (0x2FB4, 'M', u'頁'),
+ (0x2FB5, 'M', u'風'),
+ (0x2FB6, 'M', u'飛'),
+ (0x2FB7, 'M', u'食'),
+ (0x2FB8, 'M', u'首'),
+ (0x2FB9, 'M', u'香'),
+ (0x2FBA, 'M', u'馬'),
+ (0x2FBB, 'M', u'骨'),
+ (0x2FBC, 'M', u'高'),
+ (0x2FBD, 'M', u'髟'),
+ (0x2FBE, 'M', u'鬥'),
+ (0x2FBF, 'M', u'鬯'),
+ (0x2FC0, 'M', u'鬲'),
+ (0x2FC1, 'M', u'鬼'),
+ (0x2FC2, 'M', u'魚'),
+ (0x2FC3, 'M', u'鳥'),
+ (0x2FC4, 'M', u'鹵'),
+ (0x2FC5, 'M', u'鹿'),
+ (0x2FC6, 'M', u'麥'),
+ (0x2FC7, 'M', u'麻'),
+ (0x2FC8, 'M', u'黃'),
+ (0x2FC9, 'M', u'黍'),
+ (0x2FCA, 'M', u'黑'),
+ (0x2FCB, 'M', u'黹'),
+ (0x2FCC, 'M', u'黽'),
+ (0x2FCD, 'M', u'鼎'),
+ (0x2FCE, 'M', u'鼓'),
+ (0x2FCF, 'M', u'鼠'),
+ (0x2FD0, 'M', u'鼻'),
+ (0x2FD1, 'M', u'齊'),
+ (0x2FD2, 'M', u'齒'),
+ (0x2FD3, 'M', u'龍'),
+ (0x2FD4, 'M', u'龜'),
+ (0x2FD5, 'M', u'龠'),
+ (0x2FD6, 'X'),
+ (0x3000, '3', u' '),
+ (0x3001, 'V'),
+ (0x3002, 'M', u'.'),
+ ]
+
+def _seg_29():
+ return [
+ (0x3003, 'V'),
+ (0x3036, 'M', u'〒'),
+ (0x3037, 'V'),
+ (0x3038, 'M', u'十'),
+ (0x3039, 'M', u'卄'),
+ (0x303A, 'M', u'卅'),
+ (0x303B, 'V'),
+ (0x3040, 'X'),
+ (0x3041, 'V'),
+ (0x3097, 'X'),
+ (0x3099, 'V'),
+ (0x309B, '3', u' ゙'),
+ (0x309C, '3', u' ゚'),
+ (0x309D, 'V'),
+ (0x309F, 'M', u'より'),
+ (0x30A0, 'V'),
+ (0x30FF, 'M', u'コト'),
+ (0x3100, 'X'),
+ (0x3105, 'V'),
+ (0x3130, 'X'),
+ (0x3131, 'M', u'ᄀ'),
+ (0x3132, 'M', u'ᄁ'),
+ (0x3133, 'M', u'ᆪ'),
+ (0x3134, 'M', u'ᄂ'),
+ (0x3135, 'M', u'ᆬ'),
+ (0x3136, 'M', u'ᆭ'),
+ (0x3137, 'M', u'ᄃ'),
+ (0x3138, 'M', u'ᄄ'),
+ (0x3139, 'M', u'ᄅ'),
+ (0x313A, 'M', u'ᆰ'),
+ (0x313B, 'M', u'ᆱ'),
+ (0x313C, 'M', u'ᆲ'),
+ (0x313D, 'M', u'ᆳ'),
+ (0x313E, 'M', u'ᆴ'),
+ (0x313F, 'M', u'ᆵ'),
+ (0x3140, 'M', u'ᄚ'),
+ (0x3141, 'M', u'ᄆ'),
+ (0x3142, 'M', u'ᄇ'),
+ (0x3143, 'M', u'ᄈ'),
+ (0x3144, 'M', u'ᄡ'),
+ (0x3145, 'M', u'ᄉ'),
+ (0x3146, 'M', u'ᄊ'),
+ (0x3147, 'M', u'ᄋ'),
+ (0x3148, 'M', u'ᄌ'),
+ (0x3149, 'M', u'ᄍ'),
+ (0x314A, 'M', u'ᄎ'),
+ (0x314B, 'M', u'ᄏ'),
+ (0x314C, 'M', u'ᄐ'),
+ (0x314D, 'M', u'ᄑ'),
+ (0x314E, 'M', u'ᄒ'),
+ (0x314F, 'M', u'ᅡ'),
+ (0x3150, 'M', u'ᅢ'),
+ (0x3151, 'M', u'ᅣ'),
+ (0x3152, 'M', u'ᅤ'),
+ (0x3153, 'M', u'ᅥ'),
+ (0x3154, 'M', u'ᅦ'),
+ (0x3155, 'M', u'ᅧ'),
+ (0x3156, 'M', u'ᅨ'),
+ (0x3157, 'M', u'ᅩ'),
+ (0x3158, 'M', u'ᅪ'),
+ (0x3159, 'M', u'ᅫ'),
+ (0x315A, 'M', u'ᅬ'),
+ (0x315B, 'M', u'ᅭ'),
+ (0x315C, 'M', u'ᅮ'),
+ (0x315D, 'M', u'ᅯ'),
+ (0x315E, 'M', u'ᅰ'),
+ (0x315F, 'M', u'ᅱ'),
+ (0x3160, 'M', u'ᅲ'),
+ (0x3161, 'M', u'ᅳ'),
+ (0x3162, 'M', u'ᅴ'),
+ (0x3163, 'M', u'ᅵ'),
+ (0x3164, 'X'),
+ (0x3165, 'M', u'ᄔ'),
+ (0x3166, 'M', u'ᄕ'),
+ (0x3167, 'M', u'ᇇ'),
+ (0x3168, 'M', u'ᇈ'),
+ (0x3169, 'M', u'ᇌ'),
+ (0x316A, 'M', u'ᇎ'),
+ (0x316B, 'M', u'ᇓ'),
+ (0x316C, 'M', u'ᇗ'),
+ (0x316D, 'M', u'ᇙ'),
+ (0x316E, 'M', u'ᄜ'),
+ (0x316F, 'M', u'ᇝ'),
+ (0x3170, 'M', u'ᇟ'),
+ (0x3171, 'M', u'ᄝ'),
+ (0x3172, 'M', u'ᄞ'),
+ (0x3173, 'M', u'ᄠ'),
+ (0x3174, 'M', u'ᄢ'),
+ (0x3175, 'M', u'ᄣ'),
+ (0x3176, 'M', u'ᄧ'),
+ (0x3177, 'M', u'ᄩ'),
+ (0x3178, 'M', u'ᄫ'),
+ (0x3179, 'M', u'ᄬ'),
+ (0x317A, 'M', u'ᄭ'),
+ (0x317B, 'M', u'ᄮ'),
+ (0x317C, 'M', u'ᄯ'),
+ (0x317D, 'M', u'ᄲ'),
+ (0x317E, 'M', u'ᄶ'),
+ (0x317F, 'M', u'ᅀ'),
+ (0x3180, 'M', u'ᅇ'),
+ ]
+
+def _seg_30():
+ return [
+ (0x3181, 'M', u'ᅌ'),
+ (0x3182, 'M', u'ᇱ'),
+ (0x3183, 'M', u'ᇲ'),
+ (0x3184, 'M', u'ᅗ'),
+ (0x3185, 'M', u'ᅘ'),
+ (0x3186, 'M', u'ᅙ'),
+ (0x3187, 'M', u'ᆄ'),
+ (0x3188, 'M', u'ᆅ'),
+ (0x3189, 'M', u'ᆈ'),
+ (0x318A, 'M', u'ᆑ'),
+ (0x318B, 'M', u'ᆒ'),
+ (0x318C, 'M', u'ᆔ'),
+ (0x318D, 'M', u'ᆞ'),
+ (0x318E, 'M', u'ᆡ'),
+ (0x318F, 'X'),
+ (0x3190, 'V'),
+ (0x3192, 'M', u'一'),
+ (0x3193, 'M', u'二'),
+ (0x3194, 'M', u'三'),
+ (0x3195, 'M', u'四'),
+ (0x3196, 'M', u'上'),
+ (0x3197, 'M', u'中'),
+ (0x3198, 'M', u'下'),
+ (0x3199, 'M', u'甲'),
+ (0x319A, 'M', u'乙'),
+ (0x319B, 'M', u'丙'),
+ (0x319C, 'M', u'丁'),
+ (0x319D, 'M', u'天'),
+ (0x319E, 'M', u'地'),
+ (0x319F, 'M', u'人'),
+ (0x31A0, 'V'),
+ (0x31E4, 'X'),
+ (0x31F0, 'V'),
+ (0x3200, '3', u'(ᄀ)'),
+ (0x3201, '3', u'(ᄂ)'),
+ (0x3202, '3', u'(ᄃ)'),
+ (0x3203, '3', u'(ᄅ)'),
+ (0x3204, '3', u'(ᄆ)'),
+ (0x3205, '3', u'(ᄇ)'),
+ (0x3206, '3', u'(ᄉ)'),
+ (0x3207, '3', u'(ᄋ)'),
+ (0x3208, '3', u'(ᄌ)'),
+ (0x3209, '3', u'(ᄎ)'),
+ (0x320A, '3', u'(ᄏ)'),
+ (0x320B, '3', u'(ᄐ)'),
+ (0x320C, '3', u'(ᄑ)'),
+ (0x320D, '3', u'(ᄒ)'),
+ (0x320E, '3', u'(가)'),
+ (0x320F, '3', u'(나)'),
+ (0x3210, '3', u'(다)'),
+ (0x3211, '3', u'(라)'),
+ (0x3212, '3', u'(마)'),
+ (0x3213, '3', u'(바)'),
+ (0x3214, '3', u'(사)'),
+ (0x3215, '3', u'(아)'),
+ (0x3216, '3', u'(자)'),
+ (0x3217, '3', u'(차)'),
+ (0x3218, '3', u'(카)'),
+ (0x3219, '3', u'(타)'),
+ (0x321A, '3', u'(파)'),
+ (0x321B, '3', u'(하)'),
+ (0x321C, '3', u'(주)'),
+ (0x321D, '3', u'(오전)'),
+ (0x321E, '3', u'(오후)'),
+ (0x321F, 'X'),
+ (0x3220, '3', u'(一)'),
+ (0x3221, '3', u'(二)'),
+ (0x3222, '3', u'(三)'),
+ (0x3223, '3', u'(四)'),
+ (0x3224, '3', u'(五)'),
+ (0x3225, '3', u'(六)'),
+ (0x3226, '3', u'(七)'),
+ (0x3227, '3', u'(八)'),
+ (0x3228, '3', u'(九)'),
+ (0x3229, '3', u'(十)'),
+ (0x322A, '3', u'(月)'),
+ (0x322B, '3', u'(火)'),
+ (0x322C, '3', u'(水)'),
+ (0x322D, '3', u'(木)'),
+ (0x322E, '3', u'(金)'),
+ (0x322F, '3', u'(土)'),
+ (0x3230, '3', u'(日)'),
+ (0x3231, '3', u'(株)'),
+ (0x3232, '3', u'(有)'),
+ (0x3233, '3', u'(社)'),
+ (0x3234, '3', u'(名)'),
+ (0x3235, '3', u'(特)'),
+ (0x3236, '3', u'(財)'),
+ (0x3237, '3', u'(祝)'),
+ (0x3238, '3', u'(労)'),
+ (0x3239, '3', u'(代)'),
+ (0x323A, '3', u'(呼)'),
+ (0x323B, '3', u'(学)'),
+ (0x323C, '3', u'(監)'),
+ (0x323D, '3', u'(企)'),
+ (0x323E, '3', u'(資)'),
+ (0x323F, '3', u'(協)'),
+ (0x3240, '3', u'(祭)'),
+ (0x3241, '3', u'(休)'),
+ (0x3242, '3', u'(自)'),
+ ]
+
+def _seg_31():
+ return [
+ (0x3243, '3', u'(至)'),
+ (0x3244, 'M', u'問'),
+ (0x3245, 'M', u'幼'),
+ (0x3246, 'M', u'文'),
+ (0x3247, 'M', u'箏'),
+ (0x3248, 'V'),
+ (0x3250, 'M', u'pte'),
+ (0x3251, 'M', u'21'),
+ (0x3252, 'M', u'22'),
+ (0x3253, 'M', u'23'),
+ (0x3254, 'M', u'24'),
+ (0x3255, 'M', u'25'),
+ (0x3256, 'M', u'26'),
+ (0x3257, 'M', u'27'),
+ (0x3258, 'M', u'28'),
+ (0x3259, 'M', u'29'),
+ (0x325A, 'M', u'30'),
+ (0x325B, 'M', u'31'),
+ (0x325C, 'M', u'32'),
+ (0x325D, 'M', u'33'),
+ (0x325E, 'M', u'34'),
+ (0x325F, 'M', u'35'),
+ (0x3260, 'M', u'ᄀ'),
+ (0x3261, 'M', u'ᄂ'),
+ (0x3262, 'M', u'ᄃ'),
+ (0x3263, 'M', u'ᄅ'),
+ (0x3264, 'M', u'ᄆ'),
+ (0x3265, 'M', u'ᄇ'),
+ (0x3266, 'M', u'ᄉ'),
+ (0x3267, 'M', u'ᄋ'),
+ (0x3268, 'M', u'ᄌ'),
+ (0x3269, 'M', u'ᄎ'),
+ (0x326A, 'M', u'ᄏ'),
+ (0x326B, 'M', u'ᄐ'),
+ (0x326C, 'M', u'ᄑ'),
+ (0x326D, 'M', u'ᄒ'),
+ (0x326E, 'M', u'가'),
+ (0x326F, 'M', u'나'),
+ (0x3270, 'M', u'다'),
+ (0x3271, 'M', u'라'),
+ (0x3272, 'M', u'마'),
+ (0x3273, 'M', u'바'),
+ (0x3274, 'M', u'사'),
+ (0x3275, 'M', u'아'),
+ (0x3276, 'M', u'자'),
+ (0x3277, 'M', u'차'),
+ (0x3278, 'M', u'카'),
+ (0x3279, 'M', u'타'),
+ (0x327A, 'M', u'파'),
+ (0x327B, 'M', u'하'),
+ (0x327C, 'M', u'참고'),
+ (0x327D, 'M', u'주의'),
+ (0x327E, 'M', u'우'),
+ (0x327F, 'V'),
+ (0x3280, 'M', u'一'),
+ (0x3281, 'M', u'二'),
+ (0x3282, 'M', u'三'),
+ (0x3283, 'M', u'四'),
+ (0x3284, 'M', u'五'),
+ (0x3285, 'M', u'六'),
+ (0x3286, 'M', u'七'),
+ (0x3287, 'M', u'八'),
+ (0x3288, 'M', u'九'),
+ (0x3289, 'M', u'十'),
+ (0x328A, 'M', u'月'),
+ (0x328B, 'M', u'火'),
+ (0x328C, 'M', u'水'),
+ (0x328D, 'M', u'木'),
+ (0x328E, 'M', u'金'),
+ (0x328F, 'M', u'土'),
+ (0x3290, 'M', u'日'),
+ (0x3291, 'M', u'株'),
+ (0x3292, 'M', u'有'),
+ (0x3293, 'M', u'社'),
+ (0x3294, 'M', u'名'),
+ (0x3295, 'M', u'特'),
+ (0x3296, 'M', u'財'),
+ (0x3297, 'M', u'祝'),
+ (0x3298, 'M', u'労'),
+ (0x3299, 'M', u'秘'),
+ (0x329A, 'M', u'男'),
+ (0x329B, 'M', u'女'),
+ (0x329C, 'M', u'適'),
+ (0x329D, 'M', u'優'),
+ (0x329E, 'M', u'印'),
+ (0x329F, 'M', u'注'),
+ (0x32A0, 'M', u'項'),
+ (0x32A1, 'M', u'休'),
+ (0x32A2, 'M', u'写'),
+ (0x32A3, 'M', u'正'),
+ (0x32A4, 'M', u'上'),
+ (0x32A5, 'M', u'中'),
+ (0x32A6, 'M', u'下'),
+ (0x32A7, 'M', u'左'),
+ (0x32A8, 'M', u'右'),
+ (0x32A9, 'M', u'医'),
+ (0x32AA, 'M', u'宗'),
+ (0x32AB, 'M', u'学'),
+ (0x32AC, 'M', u'監'),
+ (0x32AD, 'M', u'企'),
+ ]
+
+def _seg_32():
+ return [
+ (0x32AE, 'M', u'資'),
+ (0x32AF, 'M', u'協'),
+ (0x32B0, 'M', u'夜'),
+ (0x32B1, 'M', u'36'),
+ (0x32B2, 'M', u'37'),
+ (0x32B3, 'M', u'38'),
+ (0x32B4, 'M', u'39'),
+ (0x32B5, 'M', u'40'),
+ (0x32B6, 'M', u'41'),
+ (0x32B7, 'M', u'42'),
+ (0x32B8, 'M', u'43'),
+ (0x32B9, 'M', u'44'),
+ (0x32BA, 'M', u'45'),
+ (0x32BB, 'M', u'46'),
+ (0x32BC, 'M', u'47'),
+ (0x32BD, 'M', u'48'),
+ (0x32BE, 'M', u'49'),
+ (0x32BF, 'M', u'50'),
+ (0x32C0, 'M', u'1月'),
+ (0x32C1, 'M', u'2月'),
+ (0x32C2, 'M', u'3月'),
+ (0x32C3, 'M', u'4月'),
+ (0x32C4, 'M', u'5月'),
+ (0x32C5, 'M', u'6月'),
+ (0x32C6, 'M', u'7月'),
+ (0x32C7, 'M', u'8月'),
+ (0x32C8, 'M', u'9月'),
+ (0x32C9, 'M', u'10月'),
+ (0x32CA, 'M', u'11月'),
+ (0x32CB, 'M', u'12月'),
+ (0x32CC, 'M', u'hg'),
+ (0x32CD, 'M', u'erg'),
+ (0x32CE, 'M', u'ev'),
+ (0x32CF, 'M', u'ltd'),
+ (0x32D0, 'M', u'ア'),
+ (0x32D1, 'M', u'イ'),
+ (0x32D2, 'M', u'ウ'),
+ (0x32D3, 'M', u'エ'),
+ (0x32D4, 'M', u'オ'),
+ (0x32D5, 'M', u'カ'),
+ (0x32D6, 'M', u'キ'),
+ (0x32D7, 'M', u'ク'),
+ (0x32D8, 'M', u'ケ'),
+ (0x32D9, 'M', u'コ'),
+ (0x32DA, 'M', u'サ'),
+ (0x32DB, 'M', u'シ'),
+ (0x32DC, 'M', u'ス'),
+ (0x32DD, 'M', u'セ'),
+ (0x32DE, 'M', u'ソ'),
+ (0x32DF, 'M', u'タ'),
+ (0x32E0, 'M', u'チ'),
+ (0x32E1, 'M', u'ツ'),
+ (0x32E2, 'M', u'テ'),
+ (0x32E3, 'M', u'ト'),
+ (0x32E4, 'M', u'ナ'),
+ (0x32E5, 'M', u'ニ'),
+ (0x32E6, 'M', u'ヌ'),
+ (0x32E7, 'M', u'ネ'),
+ (0x32E8, 'M', u'ノ'),
+ (0x32E9, 'M', u'ハ'),
+ (0x32EA, 'M', u'ヒ'),
+ (0x32EB, 'M', u'フ'),
+ (0x32EC, 'M', u'ヘ'),
+ (0x32ED, 'M', u'ホ'),
+ (0x32EE, 'M', u'マ'),
+ (0x32EF, 'M', u'ミ'),
+ (0x32F0, 'M', u'ム'),
+ (0x32F1, 'M', u'メ'),
+ (0x32F2, 'M', u'モ'),
+ (0x32F3, 'M', u'ヤ'),
+ (0x32F4, 'M', u'ユ'),
+ (0x32F5, 'M', u'ヨ'),
+ (0x32F6, 'M', u'ラ'),
+ (0x32F7, 'M', u'リ'),
+ (0x32F8, 'M', u'ル'),
+ (0x32F9, 'M', u'レ'),
+ (0x32FA, 'M', u'ロ'),
+ (0x32FB, 'M', u'ワ'),
+ (0x32FC, 'M', u'ヰ'),
+ (0x32FD, 'M', u'ヱ'),
+ (0x32FE, 'M', u'ヲ'),
+ (0x32FF, 'M', u'令和'),
+ (0x3300, 'M', u'アパート'),
+ (0x3301, 'M', u'アルファ'),
+ (0x3302, 'M', u'アンペア'),
+ (0x3303, 'M', u'アール'),
+ (0x3304, 'M', u'イニング'),
+ (0x3305, 'M', u'インチ'),
+ (0x3306, 'M', u'ウォン'),
+ (0x3307, 'M', u'エスクード'),
+ (0x3308, 'M', u'エーカー'),
+ (0x3309, 'M', u'オンス'),
+ (0x330A, 'M', u'オーム'),
+ (0x330B, 'M', u'カイリ'),
+ (0x330C, 'M', u'カラット'),
+ (0x330D, 'M', u'カロリー'),
+ (0x330E, 'M', u'ガロン'),
+ (0x330F, 'M', u'ガンマ'),
+ (0x3310, 'M', u'ギガ'),
+ (0x3311, 'M', u'ギニー'),
+ ]
+
+def _seg_33():
+ return [
+ (0x3312, 'M', u'キュリー'),
+ (0x3313, 'M', u'ギルダー'),
+ (0x3314, 'M', u'キロ'),
+ (0x3315, 'M', u'キログラム'),
+ (0x3316, 'M', u'キロメートル'),
+ (0x3317, 'M', u'キロワット'),
+ (0x3318, 'M', u'グラム'),
+ (0x3319, 'M', u'グラムトン'),
+ (0x331A, 'M', u'クルゼイロ'),
+ (0x331B, 'M', u'クローネ'),
+ (0x331C, 'M', u'ケース'),
+ (0x331D, 'M', u'コルナ'),
+ (0x331E, 'M', u'コーポ'),
+ (0x331F, 'M', u'サイクル'),
+ (0x3320, 'M', u'サンチーム'),
+ (0x3321, 'M', u'シリング'),
+ (0x3322, 'M', u'センチ'),
+ (0x3323, 'M', u'セント'),
+ (0x3324, 'M', u'ダース'),
+ (0x3325, 'M', u'デシ'),
+ (0x3326, 'M', u'ドル'),
+ (0x3327, 'M', u'トン'),
+ (0x3328, 'M', u'ナノ'),
+ (0x3329, 'M', u'ノット'),
+ (0x332A, 'M', u'ハイツ'),
+ (0x332B, 'M', u'パーセント'),
+ (0x332C, 'M', u'パーツ'),
+ (0x332D, 'M', u'バーレル'),
+ (0x332E, 'M', u'ピアストル'),
+ (0x332F, 'M', u'ピクル'),
+ (0x3330, 'M', u'ピコ'),
+ (0x3331, 'M', u'ビル'),
+ (0x3332, 'M', u'ファラッド'),
+ (0x3333, 'M', u'フィート'),
+ (0x3334, 'M', u'ブッシェル'),
+ (0x3335, 'M', u'フラン'),
+ (0x3336, 'M', u'ヘクタール'),
+ (0x3337, 'M', u'ペソ'),
+ (0x3338, 'M', u'ペニヒ'),
+ (0x3339, 'M', u'ヘルツ'),
+ (0x333A, 'M', u'ペンス'),
+ (0x333B, 'M', u'ページ'),
+ (0x333C, 'M', u'ベータ'),
+ (0x333D, 'M', u'ポイント'),
+ (0x333E, 'M', u'ボルト'),
+ (0x333F, 'M', u'ホン'),
+ (0x3340, 'M', u'ポンド'),
+ (0x3341, 'M', u'ホール'),
+ (0x3342, 'M', u'ホーン'),
+ (0x3343, 'M', u'マイクロ'),
+ (0x3344, 'M', u'マイル'),
+ (0x3345, 'M', u'マッハ'),
+ (0x3346, 'M', u'マルク'),
+ (0x3347, 'M', u'マンション'),
+ (0x3348, 'M', u'ミクロン'),
+ (0x3349, 'M', u'ミリ'),
+ (0x334A, 'M', u'ミリバール'),
+ (0x334B, 'M', u'メガ'),
+ (0x334C, 'M', u'メガトン'),
+ (0x334D, 'M', u'メートル'),
+ (0x334E, 'M', u'ヤード'),
+ (0x334F, 'M', u'ヤール'),
+ (0x3350, 'M', u'ユアン'),
+ (0x3351, 'M', u'リットル'),
+ (0x3352, 'M', u'リラ'),
+ (0x3353, 'M', u'ルピー'),
+ (0x3354, 'M', u'ルーブル'),
+ (0x3355, 'M', u'レム'),
+ (0x3356, 'M', u'レントゲン'),
+ (0x3357, 'M', u'ワット'),
+ (0x3358, 'M', u'0点'),
+ (0x3359, 'M', u'1点'),
+ (0x335A, 'M', u'2点'),
+ (0x335B, 'M', u'3点'),
+ (0x335C, 'M', u'4点'),
+ (0x335D, 'M', u'5点'),
+ (0x335E, 'M', u'6点'),
+ (0x335F, 'M', u'7点'),
+ (0x3360, 'M', u'8点'),
+ (0x3361, 'M', u'9点'),
+ (0x3362, 'M', u'10点'),
+ (0x3363, 'M', u'11点'),
+ (0x3364, 'M', u'12点'),
+ (0x3365, 'M', u'13点'),
+ (0x3366, 'M', u'14点'),
+ (0x3367, 'M', u'15点'),
+ (0x3368, 'M', u'16点'),
+ (0x3369, 'M', u'17点'),
+ (0x336A, 'M', u'18点'),
+ (0x336B, 'M', u'19点'),
+ (0x336C, 'M', u'20点'),
+ (0x336D, 'M', u'21点'),
+ (0x336E, 'M', u'22点'),
+ (0x336F, 'M', u'23点'),
+ (0x3370, 'M', u'24点'),
+ (0x3371, 'M', u'hpa'),
+ (0x3372, 'M', u'da'),
+ (0x3373, 'M', u'au'),
+ (0x3374, 'M', u'bar'),
+ (0x3375, 'M', u'ov'),
+ ]
+
+def _seg_34():
+ return [
+ (0x3376, 'M', u'pc'),
+ (0x3377, 'M', u'dm'),
+ (0x3378, 'M', u'dm2'),
+ (0x3379, 'M', u'dm3'),
+ (0x337A, 'M', u'iu'),
+ (0x337B, 'M', u'平成'),
+ (0x337C, 'M', u'昭和'),
+ (0x337D, 'M', u'大正'),
+ (0x337E, 'M', u'明治'),
+ (0x337F, 'M', u'株式会社'),
+ (0x3380, 'M', u'pa'),
+ (0x3381, 'M', u'na'),
+ (0x3382, 'M', u'μa'),
+ (0x3383, 'M', u'ma'),
+ (0x3384, 'M', u'ka'),
+ (0x3385, 'M', u'kb'),
+ (0x3386, 'M', u'mb'),
+ (0x3387, 'M', u'gb'),
+ (0x3388, 'M', u'cal'),
+ (0x3389, 'M', u'kcal'),
+ (0x338A, 'M', u'pf'),
+ (0x338B, 'M', u'nf'),
+ (0x338C, 'M', u'μf'),
+ (0x338D, 'M', u'μg'),
+ (0x338E, 'M', u'mg'),
+ (0x338F, 'M', u'kg'),
+ (0x3390, 'M', u'hz'),
+ (0x3391, 'M', u'khz'),
+ (0x3392, 'M', u'mhz'),
+ (0x3393, 'M', u'ghz'),
+ (0x3394, 'M', u'thz'),
+ (0x3395, 'M', u'μl'),
+ (0x3396, 'M', u'ml'),
+ (0x3397, 'M', u'dl'),
+ (0x3398, 'M', u'kl'),
+ (0x3399, 'M', u'fm'),
+ (0x339A, 'M', u'nm'),
+ (0x339B, 'M', u'μm'),
+ (0x339C, 'M', u'mm'),
+ (0x339D, 'M', u'cm'),
+ (0x339E, 'M', u'km'),
+ (0x339F, 'M', u'mm2'),
+ (0x33A0, 'M', u'cm2'),
+ (0x33A1, 'M', u'm2'),
+ (0x33A2, 'M', u'km2'),
+ (0x33A3, 'M', u'mm3'),
+ (0x33A4, 'M', u'cm3'),
+ (0x33A5, 'M', u'm3'),
+ (0x33A6, 'M', u'km3'),
+ (0x33A7, 'M', u'm∕s'),
+ (0x33A8, 'M', u'm∕s2'),
+ (0x33A9, 'M', u'pa'),
+ (0x33AA, 'M', u'kpa'),
+ (0x33AB, 'M', u'mpa'),
+ (0x33AC, 'M', u'gpa'),
+ (0x33AD, 'M', u'rad'),
+ (0x33AE, 'M', u'rad∕s'),
+ (0x33AF, 'M', u'rad∕s2'),
+ (0x33B0, 'M', u'ps'),
+ (0x33B1, 'M', u'ns'),
+ (0x33B2, 'M', u'μs'),
+ (0x33B3, 'M', u'ms'),
+ (0x33B4, 'M', u'pv'),
+ (0x33B5, 'M', u'nv'),
+ (0x33B6, 'M', u'μv'),
+ (0x33B7, 'M', u'mv'),
+ (0x33B8, 'M', u'kv'),
+ (0x33B9, 'M', u'mv'),
+ (0x33BA, 'M', u'pw'),
+ (0x33BB, 'M', u'nw'),
+ (0x33BC, 'M', u'μw'),
+ (0x33BD, 'M', u'mw'),
+ (0x33BE, 'M', u'kw'),
+ (0x33BF, 'M', u'mw'),
+ (0x33C0, 'M', u'kω'),
+ (0x33C1, 'M', u'mω'),
+ (0x33C2, 'X'),
+ (0x33C3, 'M', u'bq'),
+ (0x33C4, 'M', u'cc'),
+ (0x33C5, 'M', u'cd'),
+ (0x33C6, 'M', u'c∕kg'),
+ (0x33C7, 'X'),
+ (0x33C8, 'M', u'db'),
+ (0x33C9, 'M', u'gy'),
+ (0x33CA, 'M', u'ha'),
+ (0x33CB, 'M', u'hp'),
+ (0x33CC, 'M', u'in'),
+ (0x33CD, 'M', u'kk'),
+ (0x33CE, 'M', u'km'),
+ (0x33CF, 'M', u'kt'),
+ (0x33D0, 'M', u'lm'),
+ (0x33D1, 'M', u'ln'),
+ (0x33D2, 'M', u'log'),
+ (0x33D3, 'M', u'lx'),
+ (0x33D4, 'M', u'mb'),
+ (0x33D5, 'M', u'mil'),
+ (0x33D6, 'M', u'mol'),
+ (0x33D7, 'M', u'ph'),
+ (0x33D8, 'X'),
+ (0x33D9, 'M', u'ppm'),
+ ]
+
+def _seg_35():
+ return [
+ (0x33DA, 'M', u'pr'),
+ (0x33DB, 'M', u'sr'),
+ (0x33DC, 'M', u'sv'),
+ (0x33DD, 'M', u'wb'),
+ (0x33DE, 'M', u'v∕m'),
+ (0x33DF, 'M', u'a∕m'),
+ (0x33E0, 'M', u'1日'),
+ (0x33E1, 'M', u'2日'),
+ (0x33E2, 'M', u'3日'),
+ (0x33E3, 'M', u'4日'),
+ (0x33E4, 'M', u'5日'),
+ (0x33E5, 'M', u'6日'),
+ (0x33E6, 'M', u'7日'),
+ (0x33E7, 'M', u'8日'),
+ (0x33E8, 'M', u'9日'),
+ (0x33E9, 'M', u'10日'),
+ (0x33EA, 'M', u'11日'),
+ (0x33EB, 'M', u'12日'),
+ (0x33EC, 'M', u'13日'),
+ (0x33ED, 'M', u'14日'),
+ (0x33EE, 'M', u'15日'),
+ (0x33EF, 'M', u'16日'),
+ (0x33F0, 'M', u'17日'),
+ (0x33F1, 'M', u'18日'),
+ (0x33F2, 'M', u'19日'),
+ (0x33F3, 'M', u'20日'),
+ (0x33F4, 'M', u'21日'),
+ (0x33F5, 'M', u'22日'),
+ (0x33F6, 'M', u'23日'),
+ (0x33F7, 'M', u'24日'),
+ (0x33F8, 'M', u'25日'),
+ (0x33F9, 'M', u'26日'),
+ (0x33FA, 'M', u'27日'),
+ (0x33FB, 'M', u'28日'),
+ (0x33FC, 'M', u'29日'),
+ (0x33FD, 'M', u'30日'),
+ (0x33FE, 'M', u'31日'),
+ (0x33FF, 'M', u'gal'),
+ (0x3400, 'V'),
+ (0x9FFD, 'X'),
+ (0xA000, 'V'),
+ (0xA48D, 'X'),
+ (0xA490, 'V'),
+ (0xA4C7, 'X'),
+ (0xA4D0, 'V'),
+ (0xA62C, 'X'),
+ (0xA640, 'M', u'ꙁ'),
+ (0xA641, 'V'),
+ (0xA642, 'M', u'ꙃ'),
+ (0xA643, 'V'),
+ (0xA644, 'M', u'ꙅ'),
+ (0xA645, 'V'),
+ (0xA646, 'M', u'ꙇ'),
+ (0xA647, 'V'),
+ (0xA648, 'M', u'ꙉ'),
+ (0xA649, 'V'),
+ (0xA64A, 'M', u'ꙋ'),
+ (0xA64B, 'V'),
+ (0xA64C, 'M', u'ꙍ'),
+ (0xA64D, 'V'),
+ (0xA64E, 'M', u'ꙏ'),
+ (0xA64F, 'V'),
+ (0xA650, 'M', u'ꙑ'),
+ (0xA651, 'V'),
+ (0xA652, 'M', u'ꙓ'),
+ (0xA653, 'V'),
+ (0xA654, 'M', u'ꙕ'),
+ (0xA655, 'V'),
+ (0xA656, 'M', u'ꙗ'),
+ (0xA657, 'V'),
+ (0xA658, 'M', u'ꙙ'),
+ (0xA659, 'V'),
+ (0xA65A, 'M', u'ꙛ'),
+ (0xA65B, 'V'),
+ (0xA65C, 'M', u'ꙝ'),
+ (0xA65D, 'V'),
+ (0xA65E, 'M', u'ꙟ'),
+ (0xA65F, 'V'),
+ (0xA660, 'M', u'ꙡ'),
+ (0xA661, 'V'),
+ (0xA662, 'M', u'ꙣ'),
+ (0xA663, 'V'),
+ (0xA664, 'M', u'ꙥ'),
+ (0xA665, 'V'),
+ (0xA666, 'M', u'ꙧ'),
+ (0xA667, 'V'),
+ (0xA668, 'M', u'ꙩ'),
+ (0xA669, 'V'),
+ (0xA66A, 'M', u'ꙫ'),
+ (0xA66B, 'V'),
+ (0xA66C, 'M', u'ꙭ'),
+ (0xA66D, 'V'),
+ (0xA680, 'M', u'ꚁ'),
+ (0xA681, 'V'),
+ (0xA682, 'M', u'ꚃ'),
+ (0xA683, 'V'),
+ (0xA684, 'M', u'ꚅ'),
+ (0xA685, 'V'),
+ (0xA686, 'M', u'ꚇ'),
+ (0xA687, 'V'),
+ ]
+
+def _seg_36():
+ return [
+ (0xA688, 'M', u'ꚉ'),
+ (0xA689, 'V'),
+ (0xA68A, 'M', u'ꚋ'),
+ (0xA68B, 'V'),
+ (0xA68C, 'M', u'ꚍ'),
+ (0xA68D, 'V'),
+ (0xA68E, 'M', u'ꚏ'),
+ (0xA68F, 'V'),
+ (0xA690, 'M', u'ꚑ'),
+ (0xA691, 'V'),
+ (0xA692, 'M', u'ꚓ'),
+ (0xA693, 'V'),
+ (0xA694, 'M', u'ꚕ'),
+ (0xA695, 'V'),
+ (0xA696, 'M', u'ꚗ'),
+ (0xA697, 'V'),
+ (0xA698, 'M', u'ꚙ'),
+ (0xA699, 'V'),
+ (0xA69A, 'M', u'ꚛ'),
+ (0xA69B, 'V'),
+ (0xA69C, 'M', u'ъ'),
+ (0xA69D, 'M', u'ь'),
+ (0xA69E, 'V'),
+ (0xA6F8, 'X'),
+ (0xA700, 'V'),
+ (0xA722, 'M', u'ꜣ'),
+ (0xA723, 'V'),
+ (0xA724, 'M', u'ꜥ'),
+ (0xA725, 'V'),
+ (0xA726, 'M', u'ꜧ'),
+ (0xA727, 'V'),
+ (0xA728, 'M', u'ꜩ'),
+ (0xA729, 'V'),
+ (0xA72A, 'M', u'ꜫ'),
+ (0xA72B, 'V'),
+ (0xA72C, 'M', u'ꜭ'),
+ (0xA72D, 'V'),
+ (0xA72E, 'M', u'ꜯ'),
+ (0xA72F, 'V'),
+ (0xA732, 'M', u'ꜳ'),
+ (0xA733, 'V'),
+ (0xA734, 'M', u'ꜵ'),
+ (0xA735, 'V'),
+ (0xA736, 'M', u'ꜷ'),
+ (0xA737, 'V'),
+ (0xA738, 'M', u'ꜹ'),
+ (0xA739, 'V'),
+ (0xA73A, 'M', u'ꜻ'),
+ (0xA73B, 'V'),
+ (0xA73C, 'M', u'ꜽ'),
+ (0xA73D, 'V'),
+ (0xA73E, 'M', u'ꜿ'),
+ (0xA73F, 'V'),
+ (0xA740, 'M', u'ꝁ'),
+ (0xA741, 'V'),
+ (0xA742, 'M', u'ꝃ'),
+ (0xA743, 'V'),
+ (0xA744, 'M', u'ꝅ'),
+ (0xA745, 'V'),
+ (0xA746, 'M', u'ꝇ'),
+ (0xA747, 'V'),
+ (0xA748, 'M', u'ꝉ'),
+ (0xA749, 'V'),
+ (0xA74A, 'M', u'ꝋ'),
+ (0xA74B, 'V'),
+ (0xA74C, 'M', u'ꝍ'),
+ (0xA74D, 'V'),
+ (0xA74E, 'M', u'ꝏ'),
+ (0xA74F, 'V'),
+ (0xA750, 'M', u'ꝑ'),
+ (0xA751, 'V'),
+ (0xA752, 'M', u'ꝓ'),
+ (0xA753, 'V'),
+ (0xA754, 'M', u'ꝕ'),
+ (0xA755, 'V'),
+ (0xA756, 'M', u'ꝗ'),
+ (0xA757, 'V'),
+ (0xA758, 'M', u'ꝙ'),
+ (0xA759, 'V'),
+ (0xA75A, 'M', u'ꝛ'),
+ (0xA75B, 'V'),
+ (0xA75C, 'M', u'ꝝ'),
+ (0xA75D, 'V'),
+ (0xA75E, 'M', u'ꝟ'),
+ (0xA75F, 'V'),
+ (0xA760, 'M', u'ꝡ'),
+ (0xA761, 'V'),
+ (0xA762, 'M', u'ꝣ'),
+ (0xA763, 'V'),
+ (0xA764, 'M', u'ꝥ'),
+ (0xA765, 'V'),
+ (0xA766, 'M', u'ꝧ'),
+ (0xA767, 'V'),
+ (0xA768, 'M', u'ꝩ'),
+ (0xA769, 'V'),
+ (0xA76A, 'M', u'ꝫ'),
+ (0xA76B, 'V'),
+ (0xA76C, 'M', u'ꝭ'),
+ (0xA76D, 'V'),
+ (0xA76E, 'M', u'ꝯ'),
+ ]
+
+def _seg_37():
+ return [
+ (0xA76F, 'V'),
+ (0xA770, 'M', u'ꝯ'),
+ (0xA771, 'V'),
+ (0xA779, 'M', u'ꝺ'),
+ (0xA77A, 'V'),
+ (0xA77B, 'M', u'ꝼ'),
+ (0xA77C, 'V'),
+ (0xA77D, 'M', u'ᵹ'),
+ (0xA77E, 'M', u'ꝿ'),
+ (0xA77F, 'V'),
+ (0xA780, 'M', u'ꞁ'),
+ (0xA781, 'V'),
+ (0xA782, 'M', u'ꞃ'),
+ (0xA783, 'V'),
+ (0xA784, 'M', u'ꞅ'),
+ (0xA785, 'V'),
+ (0xA786, 'M', u'ꞇ'),
+ (0xA787, 'V'),
+ (0xA78B, 'M', u'ꞌ'),
+ (0xA78C, 'V'),
+ (0xA78D, 'M', u'ɥ'),
+ (0xA78E, 'V'),
+ (0xA790, 'M', u'ꞑ'),
+ (0xA791, 'V'),
+ (0xA792, 'M', u'ꞓ'),
+ (0xA793, 'V'),
+ (0xA796, 'M', u'ꞗ'),
+ (0xA797, 'V'),
+ (0xA798, 'M', u'ꞙ'),
+ (0xA799, 'V'),
+ (0xA79A, 'M', u'ꞛ'),
+ (0xA79B, 'V'),
+ (0xA79C, 'M', u'ꞝ'),
+ (0xA79D, 'V'),
+ (0xA79E, 'M', u'ꞟ'),
+ (0xA79F, 'V'),
+ (0xA7A0, 'M', u'ꞡ'),
+ (0xA7A1, 'V'),
+ (0xA7A2, 'M', u'ꞣ'),
+ (0xA7A3, 'V'),
+ (0xA7A4, 'M', u'ꞥ'),
+ (0xA7A5, 'V'),
+ (0xA7A6, 'M', u'ꞧ'),
+ (0xA7A7, 'V'),
+ (0xA7A8, 'M', u'ꞩ'),
+ (0xA7A9, 'V'),
+ (0xA7AA, 'M', u'ɦ'),
+ (0xA7AB, 'M', u'ɜ'),
+ (0xA7AC, 'M', u'ɡ'),
+ (0xA7AD, 'M', u'ɬ'),
+ (0xA7AE, 'M', u'ɪ'),
+ (0xA7AF, 'V'),
+ (0xA7B0, 'M', u'ʞ'),
+ (0xA7B1, 'M', u'ʇ'),
+ (0xA7B2, 'M', u'ʝ'),
+ (0xA7B3, 'M', u'ꭓ'),
+ (0xA7B4, 'M', u'ꞵ'),
+ (0xA7B5, 'V'),
+ (0xA7B6, 'M', u'ꞷ'),
+ (0xA7B7, 'V'),
+ (0xA7B8, 'M', u'ꞹ'),
+ (0xA7B9, 'V'),
+ (0xA7BA, 'M', u'ꞻ'),
+ (0xA7BB, 'V'),
+ (0xA7BC, 'M', u'ꞽ'),
+ (0xA7BD, 'V'),
+ (0xA7BE, 'M', u'ꞿ'),
+ (0xA7BF, 'V'),
+ (0xA7C0, 'X'),
+ (0xA7C2, 'M', u'ꟃ'),
+ (0xA7C3, 'V'),
+ (0xA7C4, 'M', u'ꞔ'),
+ (0xA7C5, 'M', u'ʂ'),
+ (0xA7C6, 'M', u'ᶎ'),
+ (0xA7C7, 'M', u'ꟈ'),
+ (0xA7C8, 'V'),
+ (0xA7C9, 'M', u'ꟊ'),
+ (0xA7CA, 'V'),
+ (0xA7CB, 'X'),
+ (0xA7F5, 'M', u'ꟶ'),
+ (0xA7F6, 'V'),
+ (0xA7F8, 'M', u'ħ'),
+ (0xA7F9, 'M', u'œ'),
+ (0xA7FA, 'V'),
+ (0xA82D, 'X'),
+ (0xA830, 'V'),
+ (0xA83A, 'X'),
+ (0xA840, 'V'),
+ (0xA878, 'X'),
+ (0xA880, 'V'),
+ (0xA8C6, 'X'),
+ (0xA8CE, 'V'),
+ (0xA8DA, 'X'),
+ (0xA8E0, 'V'),
+ (0xA954, 'X'),
+ (0xA95F, 'V'),
+ (0xA97D, 'X'),
+ (0xA980, 'V'),
+ (0xA9CE, 'X'),
+ (0xA9CF, 'V'),
+ ]
+
+def _seg_38():
+ return [
+ (0xA9DA, 'X'),
+ (0xA9DE, 'V'),
+ (0xA9FF, 'X'),
+ (0xAA00, 'V'),
+ (0xAA37, 'X'),
+ (0xAA40, 'V'),
+ (0xAA4E, 'X'),
+ (0xAA50, 'V'),
+ (0xAA5A, 'X'),
+ (0xAA5C, 'V'),
+ (0xAAC3, 'X'),
+ (0xAADB, 'V'),
+ (0xAAF7, 'X'),
+ (0xAB01, 'V'),
+ (0xAB07, 'X'),
+ (0xAB09, 'V'),
+ (0xAB0F, 'X'),
+ (0xAB11, 'V'),
+ (0xAB17, 'X'),
+ (0xAB20, 'V'),
+ (0xAB27, 'X'),
+ (0xAB28, 'V'),
+ (0xAB2F, 'X'),
+ (0xAB30, 'V'),
+ (0xAB5C, 'M', u'ꜧ'),
+ (0xAB5D, 'M', u'ꬷ'),
+ (0xAB5E, 'M', u'ɫ'),
+ (0xAB5F, 'M', u'ꭒ'),
+ (0xAB60, 'V'),
+ (0xAB69, 'M', u'ʍ'),
+ (0xAB6A, 'V'),
+ (0xAB6C, 'X'),
+ (0xAB70, 'M', u'Ꭰ'),
+ (0xAB71, 'M', u'Ꭱ'),
+ (0xAB72, 'M', u'Ꭲ'),
+ (0xAB73, 'M', u'Ꭳ'),
+ (0xAB74, 'M', u'Ꭴ'),
+ (0xAB75, 'M', u'Ꭵ'),
+ (0xAB76, 'M', u'Ꭶ'),
+ (0xAB77, 'M', u'Ꭷ'),
+ (0xAB78, 'M', u'Ꭸ'),
+ (0xAB79, 'M', u'Ꭹ'),
+ (0xAB7A, 'M', u'Ꭺ'),
+ (0xAB7B, 'M', u'Ꭻ'),
+ (0xAB7C, 'M', u'Ꭼ'),
+ (0xAB7D, 'M', u'Ꭽ'),
+ (0xAB7E, 'M', u'Ꭾ'),
+ (0xAB7F, 'M', u'Ꭿ'),
+ (0xAB80, 'M', u'Ꮀ'),
+ (0xAB81, 'M', u'Ꮁ'),
+ (0xAB82, 'M', u'Ꮂ'),
+ (0xAB83, 'M', u'Ꮃ'),
+ (0xAB84, 'M', u'Ꮄ'),
+ (0xAB85, 'M', u'Ꮅ'),
+ (0xAB86, 'M', u'Ꮆ'),
+ (0xAB87, 'M', u'Ꮇ'),
+ (0xAB88, 'M', u'Ꮈ'),
+ (0xAB89, 'M', u'Ꮉ'),
+ (0xAB8A, 'M', u'Ꮊ'),
+ (0xAB8B, 'M', u'Ꮋ'),
+ (0xAB8C, 'M', u'Ꮌ'),
+ (0xAB8D, 'M', u'Ꮍ'),
+ (0xAB8E, 'M', u'Ꮎ'),
+ (0xAB8F, 'M', u'Ꮏ'),
+ (0xAB90, 'M', u'Ꮐ'),
+ (0xAB91, 'M', u'Ꮑ'),
+ (0xAB92, 'M', u'Ꮒ'),
+ (0xAB93, 'M', u'Ꮓ'),
+ (0xAB94, 'M', u'Ꮔ'),
+ (0xAB95, 'M', u'Ꮕ'),
+ (0xAB96, 'M', u'Ꮖ'),
+ (0xAB97, 'M', u'Ꮗ'),
+ (0xAB98, 'M', u'Ꮘ'),
+ (0xAB99, 'M', u'Ꮙ'),
+ (0xAB9A, 'M', u'Ꮚ'),
+ (0xAB9B, 'M', u'Ꮛ'),
+ (0xAB9C, 'M', u'Ꮜ'),
+ (0xAB9D, 'M', u'Ꮝ'),
+ (0xAB9E, 'M', u'Ꮞ'),
+ (0xAB9F, 'M', u'Ꮟ'),
+ (0xABA0, 'M', u'Ꮠ'),
+ (0xABA1, 'M', u'Ꮡ'),
+ (0xABA2, 'M', u'Ꮢ'),
+ (0xABA3, 'M', u'Ꮣ'),
+ (0xABA4, 'M', u'Ꮤ'),
+ (0xABA5, 'M', u'Ꮥ'),
+ (0xABA6, 'M', u'Ꮦ'),
+ (0xABA7, 'M', u'Ꮧ'),
+ (0xABA8, 'M', u'Ꮨ'),
+ (0xABA9, 'M', u'Ꮩ'),
+ (0xABAA, 'M', u'Ꮪ'),
+ (0xABAB, 'M', u'Ꮫ'),
+ (0xABAC, 'M', u'Ꮬ'),
+ (0xABAD, 'M', u'Ꮭ'),
+ (0xABAE, 'M', u'Ꮮ'),
+ (0xABAF, 'M', u'Ꮯ'),
+ (0xABB0, 'M', u'Ꮰ'),
+ (0xABB1, 'M', u'Ꮱ'),
+ (0xABB2, 'M', u'Ꮲ'),
+ (0xABB3, 'M', u'Ꮳ'),
+ ]
+
+def _seg_39():
+ return [
+ (0xABB4, 'M', u'Ꮴ'),
+ (0xABB5, 'M', u'Ꮵ'),
+ (0xABB6, 'M', u'Ꮶ'),
+ (0xABB7, 'M', u'Ꮷ'),
+ (0xABB8, 'M', u'Ꮸ'),
+ (0xABB9, 'M', u'Ꮹ'),
+ (0xABBA, 'M', u'Ꮺ'),
+ (0xABBB, 'M', u'Ꮻ'),
+ (0xABBC, 'M', u'Ꮼ'),
+ (0xABBD, 'M', u'Ꮽ'),
+ (0xABBE, 'M', u'Ꮾ'),
+ (0xABBF, 'M', u'Ꮿ'),
+ (0xABC0, 'V'),
+ (0xABEE, 'X'),
+ (0xABF0, 'V'),
+ (0xABFA, 'X'),
+ (0xAC00, 'V'),
+ (0xD7A4, 'X'),
+ (0xD7B0, 'V'),
+ (0xD7C7, 'X'),
+ (0xD7CB, 'V'),
+ (0xD7FC, 'X'),
+ (0xF900, 'M', u'豈'),
+ (0xF901, 'M', u'更'),
+ (0xF902, 'M', u'車'),
+ (0xF903, 'M', u'賈'),
+ (0xF904, 'M', u'滑'),
+ (0xF905, 'M', u'串'),
+ (0xF906, 'M', u'句'),
+ (0xF907, 'M', u'龜'),
+ (0xF909, 'M', u'契'),
+ (0xF90A, 'M', u'金'),
+ (0xF90B, 'M', u'喇'),
+ (0xF90C, 'M', u'奈'),
+ (0xF90D, 'M', u'懶'),
+ (0xF90E, 'M', u'癩'),
+ (0xF90F, 'M', u'羅'),
+ (0xF910, 'M', u'蘿'),
+ (0xF911, 'M', u'螺'),
+ (0xF912, 'M', u'裸'),
+ (0xF913, 'M', u'邏'),
+ (0xF914, 'M', u'樂'),
+ (0xF915, 'M', u'洛'),
+ (0xF916, 'M', u'烙'),
+ (0xF917, 'M', u'珞'),
+ (0xF918, 'M', u'落'),
+ (0xF919, 'M', u'酪'),
+ (0xF91A, 'M', u'駱'),
+ (0xF91B, 'M', u'亂'),
+ (0xF91C, 'M', u'卵'),
+ (0xF91D, 'M', u'欄'),
+ (0xF91E, 'M', u'爛'),
+ (0xF91F, 'M', u'蘭'),
+ (0xF920, 'M', u'鸞'),
+ (0xF921, 'M', u'嵐'),
+ (0xF922, 'M', u'濫'),
+ (0xF923, 'M', u'藍'),
+ (0xF924, 'M', u'襤'),
+ (0xF925, 'M', u'拉'),
+ (0xF926, 'M', u'臘'),
+ (0xF927, 'M', u'蠟'),
+ (0xF928, 'M', u'廊'),
+ (0xF929, 'M', u'朗'),
+ (0xF92A, 'M', u'浪'),
+ (0xF92B, 'M', u'狼'),
+ (0xF92C, 'M', u'郎'),
+ (0xF92D, 'M', u'來'),
+ (0xF92E, 'M', u'冷'),
+ (0xF92F, 'M', u'勞'),
+ (0xF930, 'M', u'擄'),
+ (0xF931, 'M', u'櫓'),
+ (0xF932, 'M', u'爐'),
+ (0xF933, 'M', u'盧'),
+ (0xF934, 'M', u'老'),
+ (0xF935, 'M', u'蘆'),
+ (0xF936, 'M', u'虜'),
+ (0xF937, 'M', u'路'),
+ (0xF938, 'M', u'露'),
+ (0xF939, 'M', u'魯'),
+ (0xF93A, 'M', u'鷺'),
+ (0xF93B, 'M', u'碌'),
+ (0xF93C, 'M', u'祿'),
+ (0xF93D, 'M', u'綠'),
+ (0xF93E, 'M', u'菉'),
+ (0xF93F, 'M', u'錄'),
+ (0xF940, 'M', u'鹿'),
+ (0xF941, 'M', u'論'),
+ (0xF942, 'M', u'壟'),
+ (0xF943, 'M', u'弄'),
+ (0xF944, 'M', u'籠'),
+ (0xF945, 'M', u'聾'),
+ (0xF946, 'M', u'牢'),
+ (0xF947, 'M', u'磊'),
+ (0xF948, 'M', u'賂'),
+ (0xF949, 'M', u'雷'),
+ (0xF94A, 'M', u'壘'),
+ (0xF94B, 'M', u'屢'),
+ (0xF94C, 'M', u'樓'),
+ (0xF94D, 'M', u'淚'),
+ (0xF94E, 'M', u'漏'),
+ ]
+
+def _seg_40():
+ return [
+ (0xF94F, 'M', u'累'),
+ (0xF950, 'M', u'縷'),
+ (0xF951, 'M', u'陋'),
+ (0xF952, 'M', u'勒'),
+ (0xF953, 'M', u'肋'),
+ (0xF954, 'M', u'凜'),
+ (0xF955, 'M', u'凌'),
+ (0xF956, 'M', u'稜'),
+ (0xF957, 'M', u'綾'),
+ (0xF958, 'M', u'菱'),
+ (0xF959, 'M', u'陵'),
+ (0xF95A, 'M', u'讀'),
+ (0xF95B, 'M', u'拏'),
+ (0xF95C, 'M', u'樂'),
+ (0xF95D, 'M', u'諾'),
+ (0xF95E, 'M', u'丹'),
+ (0xF95F, 'M', u'寧'),
+ (0xF960, 'M', u'怒'),
+ (0xF961, 'M', u'率'),
+ (0xF962, 'M', u'異'),
+ (0xF963, 'M', u'北'),
+ (0xF964, 'M', u'磻'),
+ (0xF965, 'M', u'便'),
+ (0xF966, 'M', u'復'),
+ (0xF967, 'M', u'不'),
+ (0xF968, 'M', u'泌'),
+ (0xF969, 'M', u'數'),
+ (0xF96A, 'M', u'索'),
+ (0xF96B, 'M', u'參'),
+ (0xF96C, 'M', u'塞'),
+ (0xF96D, 'M', u'省'),
+ (0xF96E, 'M', u'葉'),
+ (0xF96F, 'M', u'說'),
+ (0xF970, 'M', u'殺'),
+ (0xF971, 'M', u'辰'),
+ (0xF972, 'M', u'沈'),
+ (0xF973, 'M', u'拾'),
+ (0xF974, 'M', u'若'),
+ (0xF975, 'M', u'掠'),
+ (0xF976, 'M', u'略'),
+ (0xF977, 'M', u'亮'),
+ (0xF978, 'M', u'兩'),
+ (0xF979, 'M', u'凉'),
+ (0xF97A, 'M', u'梁'),
+ (0xF97B, 'M', u'糧'),
+ (0xF97C, 'M', u'良'),
+ (0xF97D, 'M', u'諒'),
+ (0xF97E, 'M', u'量'),
+ (0xF97F, 'M', u'勵'),
+ (0xF980, 'M', u'呂'),
+ (0xF981, 'M', u'女'),
+ (0xF982, 'M', u'廬'),
+ (0xF983, 'M', u'旅'),
+ (0xF984, 'M', u'濾'),
+ (0xF985, 'M', u'礪'),
+ (0xF986, 'M', u'閭'),
+ (0xF987, 'M', u'驪'),
+ (0xF988, 'M', u'麗'),
+ (0xF989, 'M', u'黎'),
+ (0xF98A, 'M', u'力'),
+ (0xF98B, 'M', u'曆'),
+ (0xF98C, 'M', u'歷'),
+ (0xF98D, 'M', u'轢'),
+ (0xF98E, 'M', u'年'),
+ (0xF98F, 'M', u'憐'),
+ (0xF990, 'M', u'戀'),
+ (0xF991, 'M', u'撚'),
+ (0xF992, 'M', u'漣'),
+ (0xF993, 'M', u'煉'),
+ (0xF994, 'M', u'璉'),
+ (0xF995, 'M', u'秊'),
+ (0xF996, 'M', u'練'),
+ (0xF997, 'M', u'聯'),
+ (0xF998, 'M', u'輦'),
+ (0xF999, 'M', u'蓮'),
+ (0xF99A, 'M', u'連'),
+ (0xF99B, 'M', u'鍊'),
+ (0xF99C, 'M', u'列'),
+ (0xF99D, 'M', u'劣'),
+ (0xF99E, 'M', u'咽'),
+ (0xF99F, 'M', u'烈'),
+ (0xF9A0, 'M', u'裂'),
+ (0xF9A1, 'M', u'說'),
+ (0xF9A2, 'M', u'廉'),
+ (0xF9A3, 'M', u'念'),
+ (0xF9A4, 'M', u'捻'),
+ (0xF9A5, 'M', u'殮'),
+ (0xF9A6, 'M', u'簾'),
+ (0xF9A7, 'M', u'獵'),
+ (0xF9A8, 'M', u'令'),
+ (0xF9A9, 'M', u'囹'),
+ (0xF9AA, 'M', u'寧'),
+ (0xF9AB, 'M', u'嶺'),
+ (0xF9AC, 'M', u'怜'),
+ (0xF9AD, 'M', u'玲'),
+ (0xF9AE, 'M', u'瑩'),
+ (0xF9AF, 'M', u'羚'),
+ (0xF9B0, 'M', u'聆'),
+ (0xF9B1, 'M', u'鈴'),
+ (0xF9B2, 'M', u'零'),
+ ]
+
+def _seg_41():
+ return [
+ (0xF9B3, 'M', u'靈'),
+ (0xF9B4, 'M', u'領'),
+ (0xF9B5, 'M', u'例'),
+ (0xF9B6, 'M', u'禮'),
+ (0xF9B7, 'M', u'醴'),
+ (0xF9B8, 'M', u'隸'),
+ (0xF9B9, 'M', u'惡'),
+ (0xF9BA, 'M', u'了'),
+ (0xF9BB, 'M', u'僚'),
+ (0xF9BC, 'M', u'寮'),
+ (0xF9BD, 'M', u'尿'),
+ (0xF9BE, 'M', u'料'),
+ (0xF9BF, 'M', u'樂'),
+ (0xF9C0, 'M', u'燎'),
+ (0xF9C1, 'M', u'療'),
+ (0xF9C2, 'M', u'蓼'),
+ (0xF9C3, 'M', u'遼'),
+ (0xF9C4, 'M', u'龍'),
+ (0xF9C5, 'M', u'暈'),
+ (0xF9C6, 'M', u'阮'),
+ (0xF9C7, 'M', u'劉'),
+ (0xF9C8, 'M', u'杻'),
+ (0xF9C9, 'M', u'柳'),
+ (0xF9CA, 'M', u'流'),
+ (0xF9CB, 'M', u'溜'),
+ (0xF9CC, 'M', u'琉'),
+ (0xF9CD, 'M', u'留'),
+ (0xF9CE, 'M', u'硫'),
+ (0xF9CF, 'M', u'紐'),
+ (0xF9D0, 'M', u'類'),
+ (0xF9D1, 'M', u'六'),
+ (0xF9D2, 'M', u'戮'),
+ (0xF9D3, 'M', u'陸'),
+ (0xF9D4, 'M', u'倫'),
+ (0xF9D5, 'M', u'崙'),
+ (0xF9D6, 'M', u'淪'),
+ (0xF9D7, 'M', u'輪'),
+ (0xF9D8, 'M', u'律'),
+ (0xF9D9, 'M', u'慄'),
+ (0xF9DA, 'M', u'栗'),
+ (0xF9DB, 'M', u'率'),
+ (0xF9DC, 'M', u'隆'),
+ (0xF9DD, 'M', u'利'),
+ (0xF9DE, 'M', u'吏'),
+ (0xF9DF, 'M', u'履'),
+ (0xF9E0, 'M', u'易'),
+ (0xF9E1, 'M', u'李'),
+ (0xF9E2, 'M', u'梨'),
+ (0xF9E3, 'M', u'泥'),
+ (0xF9E4, 'M', u'理'),
+ (0xF9E5, 'M', u'痢'),
+ (0xF9E6, 'M', u'罹'),
+ (0xF9E7, 'M', u'裏'),
+ (0xF9E8, 'M', u'裡'),
+ (0xF9E9, 'M', u'里'),
+ (0xF9EA, 'M', u'離'),
+ (0xF9EB, 'M', u'匿'),
+ (0xF9EC, 'M', u'溺'),
+ (0xF9ED, 'M', u'吝'),
+ (0xF9EE, 'M', u'燐'),
+ (0xF9EF, 'M', u'璘'),
+ (0xF9F0, 'M', u'藺'),
+ (0xF9F1, 'M', u'隣'),
+ (0xF9F2, 'M', u'鱗'),
+ (0xF9F3, 'M', u'麟'),
+ (0xF9F4, 'M', u'林'),
+ (0xF9F5, 'M', u'淋'),
+ (0xF9F6, 'M', u'臨'),
+ (0xF9F7, 'M', u'立'),
+ (0xF9F8, 'M', u'笠'),
+ (0xF9F9, 'M', u'粒'),
+ (0xF9FA, 'M', u'狀'),
+ (0xF9FB, 'M', u'炙'),
+ (0xF9FC, 'M', u'識'),
+ (0xF9FD, 'M', u'什'),
+ (0xF9FE, 'M', u'茶'),
+ (0xF9FF, 'M', u'刺'),
+ (0xFA00, 'M', u'切'),
+ (0xFA01, 'M', u'度'),
+ (0xFA02, 'M', u'拓'),
+ (0xFA03, 'M', u'糖'),
+ (0xFA04, 'M', u'宅'),
+ (0xFA05, 'M', u'洞'),
+ (0xFA06, 'M', u'暴'),
+ (0xFA07, 'M', u'輻'),
+ (0xFA08, 'M', u'行'),
+ (0xFA09, 'M', u'降'),
+ (0xFA0A, 'M', u'見'),
+ (0xFA0B, 'M', u'廓'),
+ (0xFA0C, 'M', u'兀'),
+ (0xFA0D, 'M', u'嗀'),
+ (0xFA0E, 'V'),
+ (0xFA10, 'M', u'塚'),
+ (0xFA11, 'V'),
+ (0xFA12, 'M', u'晴'),
+ (0xFA13, 'V'),
+ (0xFA15, 'M', u'凞'),
+ (0xFA16, 'M', u'猪'),
+ (0xFA17, 'M', u'益'),
+ (0xFA18, 'M', u'礼'),
+ ]
+
+def _seg_42():
+ return [
+ (0xFA19, 'M', u'神'),
+ (0xFA1A, 'M', u'祥'),
+ (0xFA1B, 'M', u'福'),
+ (0xFA1C, 'M', u'靖'),
+ (0xFA1D, 'M', u'精'),
+ (0xFA1E, 'M', u'羽'),
+ (0xFA1F, 'V'),
+ (0xFA20, 'M', u'蘒'),
+ (0xFA21, 'V'),
+ (0xFA22, 'M', u'諸'),
+ (0xFA23, 'V'),
+ (0xFA25, 'M', u'逸'),
+ (0xFA26, 'M', u'都'),
+ (0xFA27, 'V'),
+ (0xFA2A, 'M', u'飯'),
+ (0xFA2B, 'M', u'飼'),
+ (0xFA2C, 'M', u'館'),
+ (0xFA2D, 'M', u'鶴'),
+ (0xFA2E, 'M', u'郞'),
+ (0xFA2F, 'M', u'隷'),
+ (0xFA30, 'M', u'侮'),
+ (0xFA31, 'M', u'僧'),
+ (0xFA32, 'M', u'免'),
+ (0xFA33, 'M', u'勉'),
+ (0xFA34, 'M', u'勤'),
+ (0xFA35, 'M', u'卑'),
+ (0xFA36, 'M', u'喝'),
+ (0xFA37, 'M', u'嘆'),
+ (0xFA38, 'M', u'器'),
+ (0xFA39, 'M', u'塀'),
+ (0xFA3A, 'M', u'墨'),
+ (0xFA3B, 'M', u'層'),
+ (0xFA3C, 'M', u'屮'),
+ (0xFA3D, 'M', u'悔'),
+ (0xFA3E, 'M', u'慨'),
+ (0xFA3F, 'M', u'憎'),
+ (0xFA40, 'M', u'懲'),
+ (0xFA41, 'M', u'敏'),
+ (0xFA42, 'M', u'既'),
+ (0xFA43, 'M', u'暑'),
+ (0xFA44, 'M', u'梅'),
+ (0xFA45, 'M', u'海'),
+ (0xFA46, 'M', u'渚'),
+ (0xFA47, 'M', u'漢'),
+ (0xFA48, 'M', u'煮'),
+ (0xFA49, 'M', u'爫'),
+ (0xFA4A, 'M', u'琢'),
+ (0xFA4B, 'M', u'碑'),
+ (0xFA4C, 'M', u'社'),
+ (0xFA4D, 'M', u'祉'),
+ (0xFA4E, 'M', u'祈'),
+ (0xFA4F, 'M', u'祐'),
+ (0xFA50, 'M', u'祖'),
+ (0xFA51, 'M', u'祝'),
+ (0xFA52, 'M', u'禍'),
+ (0xFA53, 'M', u'禎'),
+ (0xFA54, 'M', u'穀'),
+ (0xFA55, 'M', u'突'),
+ (0xFA56, 'M', u'節'),
+ (0xFA57, 'M', u'練'),
+ (0xFA58, 'M', u'縉'),
+ (0xFA59, 'M', u'繁'),
+ (0xFA5A, 'M', u'署'),
+ (0xFA5B, 'M', u'者'),
+ (0xFA5C, 'M', u'臭'),
+ (0xFA5D, 'M', u'艹'),
+ (0xFA5F, 'M', u'著'),
+ (0xFA60, 'M', u'褐'),
+ (0xFA61, 'M', u'視'),
+ (0xFA62, 'M', u'謁'),
+ (0xFA63, 'M', u'謹'),
+ (0xFA64, 'M', u'賓'),
+ (0xFA65, 'M', u'贈'),
+ (0xFA66, 'M', u'辶'),
+ (0xFA67, 'M', u'逸'),
+ (0xFA68, 'M', u'難'),
+ (0xFA69, 'M', u'響'),
+ (0xFA6A, 'M', u'頻'),
+ (0xFA6B, 'M', u'恵'),
+ (0xFA6C, 'M', u'𤋮'),
+ (0xFA6D, 'M', u'舘'),
+ (0xFA6E, 'X'),
+ (0xFA70, 'M', u'並'),
+ (0xFA71, 'M', u'况'),
+ (0xFA72, 'M', u'全'),
+ (0xFA73, 'M', u'侀'),
+ (0xFA74, 'M', u'充'),
+ (0xFA75, 'M', u'冀'),
+ (0xFA76, 'M', u'勇'),
+ (0xFA77, 'M', u'勺'),
+ (0xFA78, 'M', u'喝'),
+ (0xFA79, 'M', u'啕'),
+ (0xFA7A, 'M', u'喙'),
+ (0xFA7B, 'M', u'嗢'),
+ (0xFA7C, 'M', u'塚'),
+ (0xFA7D, 'M', u'墳'),
+ (0xFA7E, 'M', u'奄'),
+ (0xFA7F, 'M', u'奔'),
+ (0xFA80, 'M', u'婢'),
+ (0xFA81, 'M', u'嬨'),
+ ]
+
+def _seg_43():
+ return [
+ (0xFA82, 'M', u'廒'),
+ (0xFA83, 'M', u'廙'),
+ (0xFA84, 'M', u'彩'),
+ (0xFA85, 'M', u'徭'),
+ (0xFA86, 'M', u'惘'),
+ (0xFA87, 'M', u'慎'),
+ (0xFA88, 'M', u'愈'),
+ (0xFA89, 'M', u'憎'),
+ (0xFA8A, 'M', u'慠'),
+ (0xFA8B, 'M', u'懲'),
+ (0xFA8C, 'M', u'戴'),
+ (0xFA8D, 'M', u'揄'),
+ (0xFA8E, 'M', u'搜'),
+ (0xFA8F, 'M', u'摒'),
+ (0xFA90, 'M', u'敖'),
+ (0xFA91, 'M', u'晴'),
+ (0xFA92, 'M', u'朗'),
+ (0xFA93, 'M', u'望'),
+ (0xFA94, 'M', u'杖'),
+ (0xFA95, 'M', u'歹'),
+ (0xFA96, 'M', u'殺'),
+ (0xFA97, 'M', u'流'),
+ (0xFA98, 'M', u'滛'),
+ (0xFA99, 'M', u'滋'),
+ (0xFA9A, 'M', u'漢'),
+ (0xFA9B, 'M', u'瀞'),
+ (0xFA9C, 'M', u'煮'),
+ (0xFA9D, 'M', u'瞧'),
+ (0xFA9E, 'M', u'爵'),
+ (0xFA9F, 'M', u'犯'),
+ (0xFAA0, 'M', u'猪'),
+ (0xFAA1, 'M', u'瑱'),
+ (0xFAA2, 'M', u'甆'),
+ (0xFAA3, 'M', u'画'),
+ (0xFAA4, 'M', u'瘝'),
+ (0xFAA5, 'M', u'瘟'),
+ (0xFAA6, 'M', u'益'),
+ (0xFAA7, 'M', u'盛'),
+ (0xFAA8, 'M', u'直'),
+ (0xFAA9, 'M', u'睊'),
+ (0xFAAA, 'M', u'着'),
+ (0xFAAB, 'M', u'磌'),
+ (0xFAAC, 'M', u'窱'),
+ (0xFAAD, 'M', u'節'),
+ (0xFAAE, 'M', u'类'),
+ (0xFAAF, 'M', u'絛'),
+ (0xFAB0, 'M', u'練'),
+ (0xFAB1, 'M', u'缾'),
+ (0xFAB2, 'M', u'者'),
+ (0xFAB3, 'M', u'荒'),
+ (0xFAB4, 'M', u'華'),
+ (0xFAB5, 'M', u'蝹'),
+ (0xFAB6, 'M', u'襁'),
+ (0xFAB7, 'M', u'覆'),
+ (0xFAB8, 'M', u'視'),
+ (0xFAB9, 'M', u'調'),
+ (0xFABA, 'M', u'諸'),
+ (0xFABB, 'M', u'請'),
+ (0xFABC, 'M', u'謁'),
+ (0xFABD, 'M', u'諾'),
+ (0xFABE, 'M', u'諭'),
+ (0xFABF, 'M', u'謹'),
+ (0xFAC0, 'M', u'變'),
+ (0xFAC1, 'M', u'贈'),
+ (0xFAC2, 'M', u'輸'),
+ (0xFAC3, 'M', u'遲'),
+ (0xFAC4, 'M', u'醙'),
+ (0xFAC5, 'M', u'鉶'),
+ (0xFAC6, 'M', u'陼'),
+ (0xFAC7, 'M', u'難'),
+ (0xFAC8, 'M', u'靖'),
+ (0xFAC9, 'M', u'韛'),
+ (0xFACA, 'M', u'響'),
+ (0xFACB, 'M', u'頋'),
+ (0xFACC, 'M', u'頻'),
+ (0xFACD, 'M', u'鬒'),
+ (0xFACE, 'M', u'龜'),
+ (0xFACF, 'M', u'𢡊'),
+ (0xFAD0, 'M', u'𢡄'),
+ (0xFAD1, 'M', u'𣏕'),
+ (0xFAD2, 'M', u'㮝'),
+ (0xFAD3, 'M', u'䀘'),
+ (0xFAD4, 'M', u'䀹'),
+ (0xFAD5, 'M', u'𥉉'),
+ (0xFAD6, 'M', u'𥳐'),
+ (0xFAD7, 'M', u'𧻓'),
+ (0xFAD8, 'M', u'齃'),
+ (0xFAD9, 'M', u'龎'),
+ (0xFADA, 'X'),
+ (0xFB00, 'M', u'ff'),
+ (0xFB01, 'M', u'fi'),
+ (0xFB02, 'M', u'fl'),
+ (0xFB03, 'M', u'ffi'),
+ (0xFB04, 'M', u'ffl'),
+ (0xFB05, 'M', u'st'),
+ (0xFB07, 'X'),
+ (0xFB13, 'M', u'մն'),
+ (0xFB14, 'M', u'մե'),
+ (0xFB15, 'M', u'մի'),
+ (0xFB16, 'M', u'վն'),
+ ]
+
+def _seg_44():
+ return [
+ (0xFB17, 'M', u'մխ'),
+ (0xFB18, 'X'),
+ (0xFB1D, 'M', u'יִ'),
+ (0xFB1E, 'V'),
+ (0xFB1F, 'M', u'ײַ'),
+ (0xFB20, 'M', u'ע'),
+ (0xFB21, 'M', u'א'),
+ (0xFB22, 'M', u'ד'),
+ (0xFB23, 'M', u'ה'),
+ (0xFB24, 'M', u'כ'),
+ (0xFB25, 'M', u'ל'),
+ (0xFB26, 'M', u'ם'),
+ (0xFB27, 'M', u'ר'),
+ (0xFB28, 'M', u'ת'),
+ (0xFB29, '3', u'+'),
+ (0xFB2A, 'M', u'שׁ'),
+ (0xFB2B, 'M', u'שׂ'),
+ (0xFB2C, 'M', u'שּׁ'),
+ (0xFB2D, 'M', u'שּׂ'),
+ (0xFB2E, 'M', u'אַ'),
+ (0xFB2F, 'M', u'אָ'),
+ (0xFB30, 'M', u'אּ'),
+ (0xFB31, 'M', u'בּ'),
+ (0xFB32, 'M', u'גּ'),
+ (0xFB33, 'M', u'דּ'),
+ (0xFB34, 'M', u'הּ'),
+ (0xFB35, 'M', u'וּ'),
+ (0xFB36, 'M', u'זּ'),
+ (0xFB37, 'X'),
+ (0xFB38, 'M', u'טּ'),
+ (0xFB39, 'M', u'יּ'),
+ (0xFB3A, 'M', u'ךּ'),
+ (0xFB3B, 'M', u'כּ'),
+ (0xFB3C, 'M', u'לּ'),
+ (0xFB3D, 'X'),
+ (0xFB3E, 'M', u'מּ'),
+ (0xFB3F, 'X'),
+ (0xFB40, 'M', u'נּ'),
+ (0xFB41, 'M', u'סּ'),
+ (0xFB42, 'X'),
+ (0xFB43, 'M', u'ףּ'),
+ (0xFB44, 'M', u'פּ'),
+ (0xFB45, 'X'),
+ (0xFB46, 'M', u'צּ'),
+ (0xFB47, 'M', u'קּ'),
+ (0xFB48, 'M', u'רּ'),
+ (0xFB49, 'M', u'שּ'),
+ (0xFB4A, 'M', u'תּ'),
+ (0xFB4B, 'M', u'וֹ'),
+ (0xFB4C, 'M', u'בֿ'),
+ (0xFB4D, 'M', u'כֿ'),
+ (0xFB4E, 'M', u'פֿ'),
+ (0xFB4F, 'M', u'אל'),
+ (0xFB50, 'M', u'ٱ'),
+ (0xFB52, 'M', u'ٻ'),
+ (0xFB56, 'M', u'پ'),
+ (0xFB5A, 'M', u'ڀ'),
+ (0xFB5E, 'M', u'ٺ'),
+ (0xFB62, 'M', u'ٿ'),
+ (0xFB66, 'M', u'ٹ'),
+ (0xFB6A, 'M', u'ڤ'),
+ (0xFB6E, 'M', u'ڦ'),
+ (0xFB72, 'M', u'ڄ'),
+ (0xFB76, 'M', u'ڃ'),
+ (0xFB7A, 'M', u'چ'),
+ (0xFB7E, 'M', u'ڇ'),
+ (0xFB82, 'M', u'ڍ'),
+ (0xFB84, 'M', u'ڌ'),
+ (0xFB86, 'M', u'ڎ'),
+ (0xFB88, 'M', u'ڈ'),
+ (0xFB8A, 'M', u'ژ'),
+ (0xFB8C, 'M', u'ڑ'),
+ (0xFB8E, 'M', u'ک'),
+ (0xFB92, 'M', u'گ'),
+ (0xFB96, 'M', u'ڳ'),
+ (0xFB9A, 'M', u'ڱ'),
+ (0xFB9E, 'M', u'ں'),
+ (0xFBA0, 'M', u'ڻ'),
+ (0xFBA4, 'M', u'ۀ'),
+ (0xFBA6, 'M', u'ہ'),
+ (0xFBAA, 'M', u'ھ'),
+ (0xFBAE, 'M', u'ے'),
+ (0xFBB0, 'M', u'ۓ'),
+ (0xFBB2, 'V'),
+ (0xFBC2, 'X'),
+ (0xFBD3, 'M', u'ڭ'),
+ (0xFBD7, 'M', u'ۇ'),
+ (0xFBD9, 'M', u'ۆ'),
+ (0xFBDB, 'M', u'ۈ'),
+ (0xFBDD, 'M', u'ۇٴ'),
+ (0xFBDE, 'M', u'ۋ'),
+ (0xFBE0, 'M', u'ۅ'),
+ (0xFBE2, 'M', u'ۉ'),
+ (0xFBE4, 'M', u'ې'),
+ (0xFBE8, 'M', u'ى'),
+ (0xFBEA, 'M', u'ئا'),
+ (0xFBEC, 'M', u'ئە'),
+ (0xFBEE, 'M', u'ئو'),
+ (0xFBF0, 'M', u'ئۇ'),
+ (0xFBF2, 'M', u'ئۆ'),
+ ]
+
+def _seg_45():
+ return [
+ (0xFBF4, 'M', u'ئۈ'),
+ (0xFBF6, 'M', u'ئې'),
+ (0xFBF9, 'M', u'ئى'),
+ (0xFBFC, 'M', u'ی'),
+ (0xFC00, 'M', u'ئج'),
+ (0xFC01, 'M', u'ئح'),
+ (0xFC02, 'M', u'ئم'),
+ (0xFC03, 'M', u'ئى'),
+ (0xFC04, 'M', u'ئي'),
+ (0xFC05, 'M', u'بج'),
+ (0xFC06, 'M', u'بح'),
+ (0xFC07, 'M', u'بخ'),
+ (0xFC08, 'M', u'بم'),
+ (0xFC09, 'M', u'بى'),
+ (0xFC0A, 'M', u'بي'),
+ (0xFC0B, 'M', u'تج'),
+ (0xFC0C, 'M', u'تح'),
+ (0xFC0D, 'M', u'تخ'),
+ (0xFC0E, 'M', u'تم'),
+ (0xFC0F, 'M', u'تى'),
+ (0xFC10, 'M', u'تي'),
+ (0xFC11, 'M', u'ثج'),
+ (0xFC12, 'M', u'ثم'),
+ (0xFC13, 'M', u'ثى'),
+ (0xFC14, 'M', u'ثي'),
+ (0xFC15, 'M', u'جح'),
+ (0xFC16, 'M', u'جم'),
+ (0xFC17, 'M', u'حج'),
+ (0xFC18, 'M', u'حم'),
+ (0xFC19, 'M', u'خج'),
+ (0xFC1A, 'M', u'خح'),
+ (0xFC1B, 'M', u'خم'),
+ (0xFC1C, 'M', u'سج'),
+ (0xFC1D, 'M', u'سح'),
+ (0xFC1E, 'M', u'سخ'),
+ (0xFC1F, 'M', u'سم'),
+ (0xFC20, 'M', u'صح'),
+ (0xFC21, 'M', u'صم'),
+ (0xFC22, 'M', u'ضج'),
+ (0xFC23, 'M', u'ضح'),
+ (0xFC24, 'M', u'ضخ'),
+ (0xFC25, 'M', u'ضم'),
+ (0xFC26, 'M', u'طح'),
+ (0xFC27, 'M', u'طم'),
+ (0xFC28, 'M', u'ظم'),
+ (0xFC29, 'M', u'عج'),
+ (0xFC2A, 'M', u'عم'),
+ (0xFC2B, 'M', u'غج'),
+ (0xFC2C, 'M', u'غم'),
+ (0xFC2D, 'M', u'فج'),
+ (0xFC2E, 'M', u'فح'),
+ (0xFC2F, 'M', u'فخ'),
+ (0xFC30, 'M', u'فم'),
+ (0xFC31, 'M', u'فى'),
+ (0xFC32, 'M', u'في'),
+ (0xFC33, 'M', u'قح'),
+ (0xFC34, 'M', u'قم'),
+ (0xFC35, 'M', u'قى'),
+ (0xFC36, 'M', u'قي'),
+ (0xFC37, 'M', u'كا'),
+ (0xFC38, 'M', u'كج'),
+ (0xFC39, 'M', u'كح'),
+ (0xFC3A, 'M', u'كخ'),
+ (0xFC3B, 'M', u'كل'),
+ (0xFC3C, 'M', u'كم'),
+ (0xFC3D, 'M', u'كى'),
+ (0xFC3E, 'M', u'كي'),
+ (0xFC3F, 'M', u'لج'),
+ (0xFC40, 'M', u'لح'),
+ (0xFC41, 'M', u'لخ'),
+ (0xFC42, 'M', u'لم'),
+ (0xFC43, 'M', u'لى'),
+ (0xFC44, 'M', u'لي'),
+ (0xFC45, 'M', u'مج'),
+ (0xFC46, 'M', u'مح'),
+ (0xFC47, 'M', u'مخ'),
+ (0xFC48, 'M', u'مم'),
+ (0xFC49, 'M', u'مى'),
+ (0xFC4A, 'M', u'مي'),
+ (0xFC4B, 'M', u'نج'),
+ (0xFC4C, 'M', u'نح'),
+ (0xFC4D, 'M', u'نخ'),
+ (0xFC4E, 'M', u'نم'),
+ (0xFC4F, 'M', u'نى'),
+ (0xFC50, 'M', u'ني'),
+ (0xFC51, 'M', u'هج'),
+ (0xFC52, 'M', u'هم'),
+ (0xFC53, 'M', u'هى'),
+ (0xFC54, 'M', u'هي'),
+ (0xFC55, 'M', u'يج'),
+ (0xFC56, 'M', u'يح'),
+ (0xFC57, 'M', u'يخ'),
+ (0xFC58, 'M', u'يم'),
+ (0xFC59, 'M', u'يى'),
+ (0xFC5A, 'M', u'يي'),
+ (0xFC5B, 'M', u'ذٰ'),
+ (0xFC5C, 'M', u'رٰ'),
+ (0xFC5D, 'M', u'ىٰ'),
+ (0xFC5E, '3', u' ٌّ'),
+ (0xFC5F, '3', u' ٍّ'),
+ ]
+
+def _seg_46():
+ return [
+ (0xFC60, '3', u' َّ'),
+ (0xFC61, '3', u' ُّ'),
+ (0xFC62, '3', u' ِّ'),
+ (0xFC63, '3', u' ّٰ'),
+ (0xFC64, 'M', u'ئر'),
+ (0xFC65, 'M', u'ئز'),
+ (0xFC66, 'M', u'ئم'),
+ (0xFC67, 'M', u'ئن'),
+ (0xFC68, 'M', u'ئى'),
+ (0xFC69, 'M', u'ئي'),
+ (0xFC6A, 'M', u'بر'),
+ (0xFC6B, 'M', u'بز'),
+ (0xFC6C, 'M', u'بم'),
+ (0xFC6D, 'M', u'بن'),
+ (0xFC6E, 'M', u'بى'),
+ (0xFC6F, 'M', u'بي'),
+ (0xFC70, 'M', u'تر'),
+ (0xFC71, 'M', u'تز'),
+ (0xFC72, 'M', u'تم'),
+ (0xFC73, 'M', u'تن'),
+ (0xFC74, 'M', u'تى'),
+ (0xFC75, 'M', u'تي'),
+ (0xFC76, 'M', u'ثر'),
+ (0xFC77, 'M', u'ثز'),
+ (0xFC78, 'M', u'ثم'),
+ (0xFC79, 'M', u'ثن'),
+ (0xFC7A, 'M', u'ثى'),
+ (0xFC7B, 'M', u'ثي'),
+ (0xFC7C, 'M', u'فى'),
+ (0xFC7D, 'M', u'في'),
+ (0xFC7E, 'M', u'قى'),
+ (0xFC7F, 'M', u'قي'),
+ (0xFC80, 'M', u'كا'),
+ (0xFC81, 'M', u'كل'),
+ (0xFC82, 'M', u'كم'),
+ (0xFC83, 'M', u'كى'),
+ (0xFC84, 'M', u'كي'),
+ (0xFC85, 'M', u'لم'),
+ (0xFC86, 'M', u'لى'),
+ (0xFC87, 'M', u'لي'),
+ (0xFC88, 'M', u'ما'),
+ (0xFC89, 'M', u'مم'),
+ (0xFC8A, 'M', u'نر'),
+ (0xFC8B, 'M', u'نز'),
+ (0xFC8C, 'M', u'نم'),
+ (0xFC8D, 'M', u'نن'),
+ (0xFC8E, 'M', u'نى'),
+ (0xFC8F, 'M', u'ني'),
+ (0xFC90, 'M', u'ىٰ'),
+ (0xFC91, 'M', u'ير'),
+ (0xFC92, 'M', u'يز'),
+ (0xFC93, 'M', u'يم'),
+ (0xFC94, 'M', u'ين'),
+ (0xFC95, 'M', u'يى'),
+ (0xFC96, 'M', u'يي'),
+ (0xFC97, 'M', u'ئج'),
+ (0xFC98, 'M', u'ئح'),
+ (0xFC99, 'M', u'ئخ'),
+ (0xFC9A, 'M', u'ئم'),
+ (0xFC9B, 'M', u'ئه'),
+ (0xFC9C, 'M', u'بج'),
+ (0xFC9D, 'M', u'بح'),
+ (0xFC9E, 'M', u'بخ'),
+ (0xFC9F, 'M', u'بم'),
+ (0xFCA0, 'M', u'به'),
+ (0xFCA1, 'M', u'تج'),
+ (0xFCA2, 'M', u'تح'),
+ (0xFCA3, 'M', u'تخ'),
+ (0xFCA4, 'M', u'تم'),
+ (0xFCA5, 'M', u'ته'),
+ (0xFCA6, 'M', u'ثم'),
+ (0xFCA7, 'M', u'جح'),
+ (0xFCA8, 'M', u'جم'),
+ (0xFCA9, 'M', u'حج'),
+ (0xFCAA, 'M', u'حم'),
+ (0xFCAB, 'M', u'خج'),
+ (0xFCAC, 'M', u'خم'),
+ (0xFCAD, 'M', u'سج'),
+ (0xFCAE, 'M', u'سح'),
+ (0xFCAF, 'M', u'سخ'),
+ (0xFCB0, 'M', u'سم'),
+ (0xFCB1, 'M', u'صح'),
+ (0xFCB2, 'M', u'صخ'),
+ (0xFCB3, 'M', u'صم'),
+ (0xFCB4, 'M', u'ضج'),
+ (0xFCB5, 'M', u'ضح'),
+ (0xFCB6, 'M', u'ضخ'),
+ (0xFCB7, 'M', u'ضم'),
+ (0xFCB8, 'M', u'طح'),
+ (0xFCB9, 'M', u'ظم'),
+ (0xFCBA, 'M', u'عج'),
+ (0xFCBB, 'M', u'عم'),
+ (0xFCBC, 'M', u'غج'),
+ (0xFCBD, 'M', u'غم'),
+ (0xFCBE, 'M', u'فج'),
+ (0xFCBF, 'M', u'فح'),
+ (0xFCC0, 'M', u'فخ'),
+ (0xFCC1, 'M', u'فم'),
+ (0xFCC2, 'M', u'قح'),
+ (0xFCC3, 'M', u'قم'),
+ ]
+
+def _seg_47():
+ return [
+ (0xFCC4, 'M', u'كج'),
+ (0xFCC5, 'M', u'كح'),
+ (0xFCC6, 'M', u'كخ'),
+ (0xFCC7, 'M', u'كل'),
+ (0xFCC8, 'M', u'كم'),
+ (0xFCC9, 'M', u'لج'),
+ (0xFCCA, 'M', u'لح'),
+ (0xFCCB, 'M', u'لخ'),
+ (0xFCCC, 'M', u'لم'),
+ (0xFCCD, 'M', u'له'),
+ (0xFCCE, 'M', u'مج'),
+ (0xFCCF, 'M', u'مح'),
+ (0xFCD0, 'M', u'مخ'),
+ (0xFCD1, 'M', u'مم'),
+ (0xFCD2, 'M', u'نج'),
+ (0xFCD3, 'M', u'نح'),
+ (0xFCD4, 'M', u'نخ'),
+ (0xFCD5, 'M', u'نم'),
+ (0xFCD6, 'M', u'نه'),
+ (0xFCD7, 'M', u'هج'),
+ (0xFCD8, 'M', u'هم'),
+ (0xFCD9, 'M', u'هٰ'),
+ (0xFCDA, 'M', u'يج'),
+ (0xFCDB, 'M', u'يح'),
+ (0xFCDC, 'M', u'يخ'),
+ (0xFCDD, 'M', u'يم'),
+ (0xFCDE, 'M', u'يه'),
+ (0xFCDF, 'M', u'ئم'),
+ (0xFCE0, 'M', u'ئه'),
+ (0xFCE1, 'M', u'بم'),
+ (0xFCE2, 'M', u'به'),
+ (0xFCE3, 'M', u'تم'),
+ (0xFCE4, 'M', u'ته'),
+ (0xFCE5, 'M', u'ثم'),
+ (0xFCE6, 'M', u'ثه'),
+ (0xFCE7, 'M', u'سم'),
+ (0xFCE8, 'M', u'سه'),
+ (0xFCE9, 'M', u'شم'),
+ (0xFCEA, 'M', u'شه'),
+ (0xFCEB, 'M', u'كل'),
+ (0xFCEC, 'M', u'كم'),
+ (0xFCED, 'M', u'لم'),
+ (0xFCEE, 'M', u'نم'),
+ (0xFCEF, 'M', u'نه'),
+ (0xFCF0, 'M', u'يم'),
+ (0xFCF1, 'M', u'يه'),
+ (0xFCF2, 'M', u'ـَّ'),
+ (0xFCF3, 'M', u'ـُّ'),
+ (0xFCF4, 'M', u'ـِّ'),
+ (0xFCF5, 'M', u'طى'),
+ (0xFCF6, 'M', u'طي'),
+ (0xFCF7, 'M', u'عى'),
+ (0xFCF8, 'M', u'عي'),
+ (0xFCF9, 'M', u'غى'),
+ (0xFCFA, 'M', u'غي'),
+ (0xFCFB, 'M', u'سى'),
+ (0xFCFC, 'M', u'سي'),
+ (0xFCFD, 'M', u'شى'),
+ (0xFCFE, 'M', u'شي'),
+ (0xFCFF, 'M', u'حى'),
+ (0xFD00, 'M', u'حي'),
+ (0xFD01, 'M', u'جى'),
+ (0xFD02, 'M', u'جي'),
+ (0xFD03, 'M', u'خى'),
+ (0xFD04, 'M', u'خي'),
+ (0xFD05, 'M', u'صى'),
+ (0xFD06, 'M', u'صي'),
+ (0xFD07, 'M', u'ضى'),
+ (0xFD08, 'M', u'ضي'),
+ (0xFD09, 'M', u'شج'),
+ (0xFD0A, 'M', u'شح'),
+ (0xFD0B, 'M', u'شخ'),
+ (0xFD0C, 'M', u'شم'),
+ (0xFD0D, 'M', u'شر'),
+ (0xFD0E, 'M', u'سر'),
+ (0xFD0F, 'M', u'صر'),
+ (0xFD10, 'M', u'ضر'),
+ (0xFD11, 'M', u'طى'),
+ (0xFD12, 'M', u'طي'),
+ (0xFD13, 'M', u'عى'),
+ (0xFD14, 'M', u'عي'),
+ (0xFD15, 'M', u'غى'),
+ (0xFD16, 'M', u'غي'),
+ (0xFD17, 'M', u'سى'),
+ (0xFD18, 'M', u'سي'),
+ (0xFD19, 'M', u'شى'),
+ (0xFD1A, 'M', u'شي'),
+ (0xFD1B, 'M', u'حى'),
+ (0xFD1C, 'M', u'حي'),
+ (0xFD1D, 'M', u'جى'),
+ (0xFD1E, 'M', u'جي'),
+ (0xFD1F, 'M', u'خى'),
+ (0xFD20, 'M', u'خي'),
+ (0xFD21, 'M', u'صى'),
+ (0xFD22, 'M', u'صي'),
+ (0xFD23, 'M', u'ضى'),
+ (0xFD24, 'M', u'ضي'),
+ (0xFD25, 'M', u'شج'),
+ (0xFD26, 'M', u'شح'),
+ (0xFD27, 'M', u'شخ'),
+ ]
+
+def _seg_48():
+ return [
+ (0xFD28, 'M', u'شم'),
+ (0xFD29, 'M', u'شر'),
+ (0xFD2A, 'M', u'سر'),
+ (0xFD2B, 'M', u'صر'),
+ (0xFD2C, 'M', u'ضر'),
+ (0xFD2D, 'M', u'شج'),
+ (0xFD2E, 'M', u'شح'),
+ (0xFD2F, 'M', u'شخ'),
+ (0xFD30, 'M', u'شم'),
+ (0xFD31, 'M', u'سه'),
+ (0xFD32, 'M', u'شه'),
+ (0xFD33, 'M', u'طم'),
+ (0xFD34, 'M', u'سج'),
+ (0xFD35, 'M', u'سح'),
+ (0xFD36, 'M', u'سخ'),
+ (0xFD37, 'M', u'شج'),
+ (0xFD38, 'M', u'شح'),
+ (0xFD39, 'M', u'شخ'),
+ (0xFD3A, 'M', u'طم'),
+ (0xFD3B, 'M', u'ظم'),
+ (0xFD3C, 'M', u'اً'),
+ (0xFD3E, 'V'),
+ (0xFD40, 'X'),
+ (0xFD50, 'M', u'تجم'),
+ (0xFD51, 'M', u'تحج'),
+ (0xFD53, 'M', u'تحم'),
+ (0xFD54, 'M', u'تخم'),
+ (0xFD55, 'M', u'تمج'),
+ (0xFD56, 'M', u'تمح'),
+ (0xFD57, 'M', u'تمخ'),
+ (0xFD58, 'M', u'جمح'),
+ (0xFD5A, 'M', u'حمي'),
+ (0xFD5B, 'M', u'حمى'),
+ (0xFD5C, 'M', u'سحج'),
+ (0xFD5D, 'M', u'سجح'),
+ (0xFD5E, 'M', u'سجى'),
+ (0xFD5F, 'M', u'سمح'),
+ (0xFD61, 'M', u'سمج'),
+ (0xFD62, 'M', u'سمم'),
+ (0xFD64, 'M', u'صحح'),
+ (0xFD66, 'M', u'صمم'),
+ (0xFD67, 'M', u'شحم'),
+ (0xFD69, 'M', u'شجي'),
+ (0xFD6A, 'M', u'شمخ'),
+ (0xFD6C, 'M', u'شمم'),
+ (0xFD6E, 'M', u'ضحى'),
+ (0xFD6F, 'M', u'ضخم'),
+ (0xFD71, 'M', u'طمح'),
+ (0xFD73, 'M', u'طمم'),
+ (0xFD74, 'M', u'طمي'),
+ (0xFD75, 'M', u'عجم'),
+ (0xFD76, 'M', u'عمم'),
+ (0xFD78, 'M', u'عمى'),
+ (0xFD79, 'M', u'غمم'),
+ (0xFD7A, 'M', u'غمي'),
+ (0xFD7B, 'M', u'غمى'),
+ (0xFD7C, 'M', u'فخم'),
+ (0xFD7E, 'M', u'قمح'),
+ (0xFD7F, 'M', u'قمم'),
+ (0xFD80, 'M', u'لحم'),
+ (0xFD81, 'M', u'لحي'),
+ (0xFD82, 'M', u'لحى'),
+ (0xFD83, 'M', u'لجج'),
+ (0xFD85, 'M', u'لخم'),
+ (0xFD87, 'M', u'لمح'),
+ (0xFD89, 'M', u'محج'),
+ (0xFD8A, 'M', u'محم'),
+ (0xFD8B, 'M', u'محي'),
+ (0xFD8C, 'M', u'مجح'),
+ (0xFD8D, 'M', u'مجم'),
+ (0xFD8E, 'M', u'مخج'),
+ (0xFD8F, 'M', u'مخم'),
+ (0xFD90, 'X'),
+ (0xFD92, 'M', u'مجخ'),
+ (0xFD93, 'M', u'همج'),
+ (0xFD94, 'M', u'همم'),
+ (0xFD95, 'M', u'نحم'),
+ (0xFD96, 'M', u'نحى'),
+ (0xFD97, 'M', u'نجم'),
+ (0xFD99, 'M', u'نجى'),
+ (0xFD9A, 'M', u'نمي'),
+ (0xFD9B, 'M', u'نمى'),
+ (0xFD9C, 'M', u'يمم'),
+ (0xFD9E, 'M', u'بخي'),
+ (0xFD9F, 'M', u'تجي'),
+ (0xFDA0, 'M', u'تجى'),
+ (0xFDA1, 'M', u'تخي'),
+ (0xFDA2, 'M', u'تخى'),
+ (0xFDA3, 'M', u'تمي'),
+ (0xFDA4, 'M', u'تمى'),
+ (0xFDA5, 'M', u'جمي'),
+ (0xFDA6, 'M', u'جحى'),
+ (0xFDA7, 'M', u'جمى'),
+ (0xFDA8, 'M', u'سخى'),
+ (0xFDA9, 'M', u'صحي'),
+ (0xFDAA, 'M', u'شحي'),
+ (0xFDAB, 'M', u'ضحي'),
+ (0xFDAC, 'M', u'لجي'),
+ (0xFDAD, 'M', u'لمي'),
+ (0xFDAE, 'M', u'يحي'),
+ ]
+
+def _seg_49():
+ return [
+ (0xFDAF, 'M', u'يجي'),
+ (0xFDB0, 'M', u'يمي'),
+ (0xFDB1, 'M', u'ممي'),
+ (0xFDB2, 'M', u'قمي'),
+ (0xFDB3, 'M', u'نحي'),
+ (0xFDB4, 'M', u'قمح'),
+ (0xFDB5, 'M', u'لحم'),
+ (0xFDB6, 'M', u'عمي'),
+ (0xFDB7, 'M', u'كمي'),
+ (0xFDB8, 'M', u'نجح'),
+ (0xFDB9, 'M', u'مخي'),
+ (0xFDBA, 'M', u'لجم'),
+ (0xFDBB, 'M', u'كمم'),
+ (0xFDBC, 'M', u'لجم'),
+ (0xFDBD, 'M', u'نجح'),
+ (0xFDBE, 'M', u'جحي'),
+ (0xFDBF, 'M', u'حجي'),
+ (0xFDC0, 'M', u'مجي'),
+ (0xFDC1, 'M', u'فمي'),
+ (0xFDC2, 'M', u'بحي'),
+ (0xFDC3, 'M', u'كمم'),
+ (0xFDC4, 'M', u'عجم'),
+ (0xFDC5, 'M', u'صمم'),
+ (0xFDC6, 'M', u'سخي'),
+ (0xFDC7, 'M', u'نجي'),
+ (0xFDC8, 'X'),
+ (0xFDF0, 'M', u'صلے'),
+ (0xFDF1, 'M', u'قلے'),
+ (0xFDF2, 'M', u'الله'),
+ (0xFDF3, 'M', u'اكبر'),
+ (0xFDF4, 'M', u'محمد'),
+ (0xFDF5, 'M', u'صلعم'),
+ (0xFDF6, 'M', u'رسول'),
+ (0xFDF7, 'M', u'عليه'),
+ (0xFDF8, 'M', u'وسلم'),
+ (0xFDF9, 'M', u'صلى'),
+ (0xFDFA, '3', u'صلى الله عليه وسلم'),
+ (0xFDFB, '3', u'جل جلاله'),
+ (0xFDFC, 'M', u'ریال'),
+ (0xFDFD, 'V'),
+ (0xFDFE, 'X'),
+ (0xFE00, 'I'),
+ (0xFE10, '3', u','),
+ (0xFE11, 'M', u'、'),
+ (0xFE12, 'X'),
+ (0xFE13, '3', u':'),
+ (0xFE14, '3', u';'),
+ (0xFE15, '3', u'!'),
+ (0xFE16, '3', u'?'),
+ (0xFE17, 'M', u'〖'),
+ (0xFE18, 'M', u'〗'),
+ (0xFE19, 'X'),
+ (0xFE20, 'V'),
+ (0xFE30, 'X'),
+ (0xFE31, 'M', u'—'),
+ (0xFE32, 'M', u'–'),
+ (0xFE33, '3', u'_'),
+ (0xFE35, '3', u'('),
+ (0xFE36, '3', u')'),
+ (0xFE37, '3', u'{'),
+ (0xFE38, '3', u'}'),
+ (0xFE39, 'M', u'〔'),
+ (0xFE3A, 'M', u'〕'),
+ (0xFE3B, 'M', u'【'),
+ (0xFE3C, 'M', u'】'),
+ (0xFE3D, 'M', u'《'),
+ (0xFE3E, 'M', u'》'),
+ (0xFE3F, 'M', u'〈'),
+ (0xFE40, 'M', u'〉'),
+ (0xFE41, 'M', u'「'),
+ (0xFE42, 'M', u'」'),
+ (0xFE43, 'M', u'『'),
+ (0xFE44, 'M', u'』'),
+ (0xFE45, 'V'),
+ (0xFE47, '3', u'['),
+ (0xFE48, '3', u']'),
+ (0xFE49, '3', u' ̅'),
+ (0xFE4D, '3', u'_'),
+ (0xFE50, '3', u','),
+ (0xFE51, 'M', u'、'),
+ (0xFE52, 'X'),
+ (0xFE54, '3', u';'),
+ (0xFE55, '3', u':'),
+ (0xFE56, '3', u'?'),
+ (0xFE57, '3', u'!'),
+ (0xFE58, 'M', u'—'),
+ (0xFE59, '3', u'('),
+ (0xFE5A, '3', u')'),
+ (0xFE5B, '3', u'{'),
+ (0xFE5C, '3', u'}'),
+ (0xFE5D, 'M', u'〔'),
+ (0xFE5E, 'M', u'〕'),
+ (0xFE5F, '3', u'#'),
+ (0xFE60, '3', u'&'),
+ (0xFE61, '3', u'*'),
+ (0xFE62, '3', u'+'),
+ (0xFE63, 'M', u'-'),
+ (0xFE64, '3', u'<'),
+ (0xFE65, '3', u'>'),
+ (0xFE66, '3', u'='),
+ ]
+
+def _seg_50():
+ return [
+ (0xFE67, 'X'),
+ (0xFE68, '3', u'\\'),
+ (0xFE69, '3', u'$'),
+ (0xFE6A, '3', u'%'),
+ (0xFE6B, '3', u'@'),
+ (0xFE6C, 'X'),
+ (0xFE70, '3', u' ً'),
+ (0xFE71, 'M', u'ـً'),
+ (0xFE72, '3', u' ٌ'),
+ (0xFE73, 'V'),
+ (0xFE74, '3', u' ٍ'),
+ (0xFE75, 'X'),
+ (0xFE76, '3', u' َ'),
+ (0xFE77, 'M', u'ـَ'),
+ (0xFE78, '3', u' ُ'),
+ (0xFE79, 'M', u'ـُ'),
+ (0xFE7A, '3', u' ِ'),
+ (0xFE7B, 'M', u'ـِ'),
+ (0xFE7C, '3', u' ّ'),
+ (0xFE7D, 'M', u'ـّ'),
+ (0xFE7E, '3', u' ْ'),
+ (0xFE7F, 'M', u'ـْ'),
+ (0xFE80, 'M', u'ء'),
+ (0xFE81, 'M', u'آ'),
+ (0xFE83, 'M', u'أ'),
+ (0xFE85, 'M', u'ؤ'),
+ (0xFE87, 'M', u'إ'),
+ (0xFE89, 'M', u'ئ'),
+ (0xFE8D, 'M', u'ا'),
+ (0xFE8F, 'M', u'ب'),
+ (0xFE93, 'M', u'ة'),
+ (0xFE95, 'M', u'ت'),
+ (0xFE99, 'M', u'ث'),
+ (0xFE9D, 'M', u'ج'),
+ (0xFEA1, 'M', u'ح'),
+ (0xFEA5, 'M', u'خ'),
+ (0xFEA9, 'M', u'د'),
+ (0xFEAB, 'M', u'ذ'),
+ (0xFEAD, 'M', u'ر'),
+ (0xFEAF, 'M', u'ز'),
+ (0xFEB1, 'M', u'س'),
+ (0xFEB5, 'M', u'ش'),
+ (0xFEB9, 'M', u'ص'),
+ (0xFEBD, 'M', u'ض'),
+ (0xFEC1, 'M', u'ط'),
+ (0xFEC5, 'M', u'ظ'),
+ (0xFEC9, 'M', u'ع'),
+ (0xFECD, 'M', u'غ'),
+ (0xFED1, 'M', u'ف'),
+ (0xFED5, 'M', u'ق'),
+ (0xFED9, 'M', u'ك'),
+ (0xFEDD, 'M', u'ل'),
+ (0xFEE1, 'M', u'م'),
+ (0xFEE5, 'M', u'ن'),
+ (0xFEE9, 'M', u'ه'),
+ (0xFEED, 'M', u'و'),
+ (0xFEEF, 'M', u'ى'),
+ (0xFEF1, 'M', u'ي'),
+ (0xFEF5, 'M', u'لآ'),
+ (0xFEF7, 'M', u'لأ'),
+ (0xFEF9, 'M', u'لإ'),
+ (0xFEFB, 'M', u'لا'),
+ (0xFEFD, 'X'),
+ (0xFEFF, 'I'),
+ (0xFF00, 'X'),
+ (0xFF01, '3', u'!'),
+ (0xFF02, '3', u'"'),
+ (0xFF03, '3', u'#'),
+ (0xFF04, '3', u'$'),
+ (0xFF05, '3', u'%'),
+ (0xFF06, '3', u'&'),
+ (0xFF07, '3', u'\''),
+ (0xFF08, '3', u'('),
+ (0xFF09, '3', u')'),
+ (0xFF0A, '3', u'*'),
+ (0xFF0B, '3', u'+'),
+ (0xFF0C, '3', u','),
+ (0xFF0D, 'M', u'-'),
+ (0xFF0E, 'M', u'.'),
+ (0xFF0F, '3', u'/'),
+ (0xFF10, 'M', u'0'),
+ (0xFF11, 'M', u'1'),
+ (0xFF12, 'M', u'2'),
+ (0xFF13, 'M', u'3'),
+ (0xFF14, 'M', u'4'),
+ (0xFF15, 'M', u'5'),
+ (0xFF16, 'M', u'6'),
+ (0xFF17, 'M', u'7'),
+ (0xFF18, 'M', u'8'),
+ (0xFF19, 'M', u'9'),
+ (0xFF1A, '3', u':'),
+ (0xFF1B, '3', u';'),
+ (0xFF1C, '3', u'<'),
+ (0xFF1D, '3', u'='),
+ (0xFF1E, '3', u'>'),
+ (0xFF1F, '3', u'?'),
+ (0xFF20, '3', u'@'),
+ (0xFF21, 'M', u'a'),
+ (0xFF22, 'M', u'b'),
+ (0xFF23, 'M', u'c'),
+ ]
+
+def _seg_51():
+ return [
+ (0xFF24, 'M', u'd'),
+ (0xFF25, 'M', u'e'),
+ (0xFF26, 'M', u'f'),
+ (0xFF27, 'M', u'g'),
+ (0xFF28, 'M', u'h'),
+ (0xFF29, 'M', u'i'),
+ (0xFF2A, 'M', u'j'),
+ (0xFF2B, 'M', u'k'),
+ (0xFF2C, 'M', u'l'),
+ (0xFF2D, 'M', u'm'),
+ (0xFF2E, 'M', u'n'),
+ (0xFF2F, 'M', u'o'),
+ (0xFF30, 'M', u'p'),
+ (0xFF31, 'M', u'q'),
+ (0xFF32, 'M', u'r'),
+ (0xFF33, 'M', u's'),
+ (0xFF34, 'M', u't'),
+ (0xFF35, 'M', u'u'),
+ (0xFF36, 'M', u'v'),
+ (0xFF37, 'M', u'w'),
+ (0xFF38, 'M', u'x'),
+ (0xFF39, 'M', u'y'),
+ (0xFF3A, 'M', u'z'),
+ (0xFF3B, '3', u'['),
+ (0xFF3C, '3', u'\\'),
+ (0xFF3D, '3', u']'),
+ (0xFF3E, '3', u'^'),
+ (0xFF3F, '3', u'_'),
+ (0xFF40, '3', u'`'),
+ (0xFF41, 'M', u'a'),
+ (0xFF42, 'M', u'b'),
+ (0xFF43, 'M', u'c'),
+ (0xFF44, 'M', u'd'),
+ (0xFF45, 'M', u'e'),
+ (0xFF46, 'M', u'f'),
+ (0xFF47, 'M', u'g'),
+ (0xFF48, 'M', u'h'),
+ (0xFF49, 'M', u'i'),
+ (0xFF4A, 'M', u'j'),
+ (0xFF4B, 'M', u'k'),
+ (0xFF4C, 'M', u'l'),
+ (0xFF4D, 'M', u'm'),
+ (0xFF4E, 'M', u'n'),
+ (0xFF4F, 'M', u'o'),
+ (0xFF50, 'M', u'p'),
+ (0xFF51, 'M', u'q'),
+ (0xFF52, 'M', u'r'),
+ (0xFF53, 'M', u's'),
+ (0xFF54, 'M', u't'),
+ (0xFF55, 'M', u'u'),
+ (0xFF56, 'M', u'v'),
+ (0xFF57, 'M', u'w'),
+ (0xFF58, 'M', u'x'),
+ (0xFF59, 'M', u'y'),
+ (0xFF5A, 'M', u'z'),
+ (0xFF5B, '3', u'{'),
+ (0xFF5C, '3', u'|'),
+ (0xFF5D, '3', u'}'),
+ (0xFF5E, '3', u'~'),
+ (0xFF5F, 'M', u'⦅'),
+ (0xFF60, 'M', u'⦆'),
+ (0xFF61, 'M', u'.'),
+ (0xFF62, 'M', u'「'),
+ (0xFF63, 'M', u'」'),
+ (0xFF64, 'M', u'、'),
+ (0xFF65, 'M', u'・'),
+ (0xFF66, 'M', u'ヲ'),
+ (0xFF67, 'M', u'ァ'),
+ (0xFF68, 'M', u'ィ'),
+ (0xFF69, 'M', u'ゥ'),
+ (0xFF6A, 'M', u'ェ'),
+ (0xFF6B, 'M', u'ォ'),
+ (0xFF6C, 'M', u'ャ'),
+ (0xFF6D, 'M', u'ュ'),
+ (0xFF6E, 'M', u'ョ'),
+ (0xFF6F, 'M', u'ッ'),
+ (0xFF70, 'M', u'ー'),
+ (0xFF71, 'M', u'ア'),
+ (0xFF72, 'M', u'イ'),
+ (0xFF73, 'M', u'ウ'),
+ (0xFF74, 'M', u'エ'),
+ (0xFF75, 'M', u'オ'),
+ (0xFF76, 'M', u'カ'),
+ (0xFF77, 'M', u'キ'),
+ (0xFF78, 'M', u'ク'),
+ (0xFF79, 'M', u'ケ'),
+ (0xFF7A, 'M', u'コ'),
+ (0xFF7B, 'M', u'サ'),
+ (0xFF7C, 'M', u'シ'),
+ (0xFF7D, 'M', u'ス'),
+ (0xFF7E, 'M', u'セ'),
+ (0xFF7F, 'M', u'ソ'),
+ (0xFF80, 'M', u'タ'),
+ (0xFF81, 'M', u'チ'),
+ (0xFF82, 'M', u'ツ'),
+ (0xFF83, 'M', u'テ'),
+ (0xFF84, 'M', u'ト'),
+ (0xFF85, 'M', u'ナ'),
+ (0xFF86, 'M', u'ニ'),
+ (0xFF87, 'M', u'ヌ'),
+ ]
+
+def _seg_52():
+ return [
+ (0xFF88, 'M', u'ネ'),
+ (0xFF89, 'M', u'ノ'),
+ (0xFF8A, 'M', u'ハ'),
+ (0xFF8B, 'M', u'ヒ'),
+ (0xFF8C, 'M', u'フ'),
+ (0xFF8D, 'M', u'ヘ'),
+ (0xFF8E, 'M', u'ホ'),
+ (0xFF8F, 'M', u'マ'),
+ (0xFF90, 'M', u'ミ'),
+ (0xFF91, 'M', u'ム'),
+ (0xFF92, 'M', u'メ'),
+ (0xFF93, 'M', u'モ'),
+ (0xFF94, 'M', u'ヤ'),
+ (0xFF95, 'M', u'ユ'),
+ (0xFF96, 'M', u'ヨ'),
+ (0xFF97, 'M', u'ラ'),
+ (0xFF98, 'M', u'リ'),
+ (0xFF99, 'M', u'ル'),
+ (0xFF9A, 'M', u'レ'),
+ (0xFF9B, 'M', u'ロ'),
+ (0xFF9C, 'M', u'ワ'),
+ (0xFF9D, 'M', u'ン'),
+ (0xFF9E, 'M', u'゙'),
+ (0xFF9F, 'M', u'゚'),
+ (0xFFA0, 'X'),
+ (0xFFA1, 'M', u'ᄀ'),
+ (0xFFA2, 'M', u'ᄁ'),
+ (0xFFA3, 'M', u'ᆪ'),
+ (0xFFA4, 'M', u'ᄂ'),
+ (0xFFA5, 'M', u'ᆬ'),
+ (0xFFA6, 'M', u'ᆭ'),
+ (0xFFA7, 'M', u'ᄃ'),
+ (0xFFA8, 'M', u'ᄄ'),
+ (0xFFA9, 'M', u'ᄅ'),
+ (0xFFAA, 'M', u'ᆰ'),
+ (0xFFAB, 'M', u'ᆱ'),
+ (0xFFAC, 'M', u'ᆲ'),
+ (0xFFAD, 'M', u'ᆳ'),
+ (0xFFAE, 'M', u'ᆴ'),
+ (0xFFAF, 'M', u'ᆵ'),
+ (0xFFB0, 'M', u'ᄚ'),
+ (0xFFB1, 'M', u'ᄆ'),
+ (0xFFB2, 'M', u'ᄇ'),
+ (0xFFB3, 'M', u'ᄈ'),
+ (0xFFB4, 'M', u'ᄡ'),
+ (0xFFB5, 'M', u'ᄉ'),
+ (0xFFB6, 'M', u'ᄊ'),
+ (0xFFB7, 'M', u'ᄋ'),
+ (0xFFB8, 'M', u'ᄌ'),
+ (0xFFB9, 'M', u'ᄍ'),
+ (0xFFBA, 'M', u'ᄎ'),
+ (0xFFBB, 'M', u'ᄏ'),
+ (0xFFBC, 'M', u'ᄐ'),
+ (0xFFBD, 'M', u'ᄑ'),
+ (0xFFBE, 'M', u'ᄒ'),
+ (0xFFBF, 'X'),
+ (0xFFC2, 'M', u'ᅡ'),
+ (0xFFC3, 'M', u'ᅢ'),
+ (0xFFC4, 'M', u'ᅣ'),
+ (0xFFC5, 'M', u'ᅤ'),
+ (0xFFC6, 'M', u'ᅥ'),
+ (0xFFC7, 'M', u'ᅦ'),
+ (0xFFC8, 'X'),
+ (0xFFCA, 'M', u'ᅧ'),
+ (0xFFCB, 'M', u'ᅨ'),
+ (0xFFCC, 'M', u'ᅩ'),
+ (0xFFCD, 'M', u'ᅪ'),
+ (0xFFCE, 'M', u'ᅫ'),
+ (0xFFCF, 'M', u'ᅬ'),
+ (0xFFD0, 'X'),
+ (0xFFD2, 'M', u'ᅭ'),
+ (0xFFD3, 'M', u'ᅮ'),
+ (0xFFD4, 'M', u'ᅯ'),
+ (0xFFD5, 'M', u'ᅰ'),
+ (0xFFD6, 'M', u'ᅱ'),
+ (0xFFD7, 'M', u'ᅲ'),
+ (0xFFD8, 'X'),
+ (0xFFDA, 'M', u'ᅳ'),
+ (0xFFDB, 'M', u'ᅴ'),
+ (0xFFDC, 'M', u'ᅵ'),
+ (0xFFDD, 'X'),
+ (0xFFE0, 'M', u'¢'),
+ (0xFFE1, 'M', u'£'),
+ (0xFFE2, 'M', u'¬'),
+ (0xFFE3, '3', u' ̄'),
+ (0xFFE4, 'M', u'¦'),
+ (0xFFE5, 'M', u'¥'),
+ (0xFFE6, 'M', u'₩'),
+ (0xFFE7, 'X'),
+ (0xFFE8, 'M', u'│'),
+ (0xFFE9, 'M', u'←'),
+ (0xFFEA, 'M', u'↑'),
+ (0xFFEB, 'M', u'→'),
+ (0xFFEC, 'M', u'↓'),
+ (0xFFED, 'M', u'■'),
+ (0xFFEE, 'M', u'○'),
+ (0xFFEF, 'X'),
+ (0x10000, 'V'),
+ (0x1000C, 'X'),
+ (0x1000D, 'V'),
+ ]
+
+def _seg_53():
+ return [
+ (0x10027, 'X'),
+ (0x10028, 'V'),
+ (0x1003B, 'X'),
+ (0x1003C, 'V'),
+ (0x1003E, 'X'),
+ (0x1003F, 'V'),
+ (0x1004E, 'X'),
+ (0x10050, 'V'),
+ (0x1005E, 'X'),
+ (0x10080, 'V'),
+ (0x100FB, 'X'),
+ (0x10100, 'V'),
+ (0x10103, 'X'),
+ (0x10107, 'V'),
+ (0x10134, 'X'),
+ (0x10137, 'V'),
+ (0x1018F, 'X'),
+ (0x10190, 'V'),
+ (0x1019D, 'X'),
+ (0x101A0, 'V'),
+ (0x101A1, 'X'),
+ (0x101D0, 'V'),
+ (0x101FE, 'X'),
+ (0x10280, 'V'),
+ (0x1029D, 'X'),
+ (0x102A0, 'V'),
+ (0x102D1, 'X'),
+ (0x102E0, 'V'),
+ (0x102FC, 'X'),
+ (0x10300, 'V'),
+ (0x10324, 'X'),
+ (0x1032D, 'V'),
+ (0x1034B, 'X'),
+ (0x10350, 'V'),
+ (0x1037B, 'X'),
+ (0x10380, 'V'),
+ (0x1039E, 'X'),
+ (0x1039F, 'V'),
+ (0x103C4, 'X'),
+ (0x103C8, 'V'),
+ (0x103D6, 'X'),
+ (0x10400, 'M', u'𐐨'),
+ (0x10401, 'M', u'𐐩'),
+ (0x10402, 'M', u'𐐪'),
+ (0x10403, 'M', u'𐐫'),
+ (0x10404, 'M', u'𐐬'),
+ (0x10405, 'M', u'𐐭'),
+ (0x10406, 'M', u'𐐮'),
+ (0x10407, 'M', u'𐐯'),
+ (0x10408, 'M', u'𐐰'),
+ (0x10409, 'M', u'𐐱'),
+ (0x1040A, 'M', u'𐐲'),
+ (0x1040B, 'M', u'𐐳'),
+ (0x1040C, 'M', u'𐐴'),
+ (0x1040D, 'M', u'𐐵'),
+ (0x1040E, 'M', u'𐐶'),
+ (0x1040F, 'M', u'𐐷'),
+ (0x10410, 'M', u'𐐸'),
+ (0x10411, 'M', u'𐐹'),
+ (0x10412, 'M', u'𐐺'),
+ (0x10413, 'M', u'𐐻'),
+ (0x10414, 'M', u'𐐼'),
+ (0x10415, 'M', u'𐐽'),
+ (0x10416, 'M', u'𐐾'),
+ (0x10417, 'M', u'𐐿'),
+ (0x10418, 'M', u'𐑀'),
+ (0x10419, 'M', u'𐑁'),
+ (0x1041A, 'M', u'𐑂'),
+ (0x1041B, 'M', u'𐑃'),
+ (0x1041C, 'M', u'𐑄'),
+ (0x1041D, 'M', u'𐑅'),
+ (0x1041E, 'M', u'𐑆'),
+ (0x1041F, 'M', u'𐑇'),
+ (0x10420, 'M', u'𐑈'),
+ (0x10421, 'M', u'𐑉'),
+ (0x10422, 'M', u'𐑊'),
+ (0x10423, 'M', u'𐑋'),
+ (0x10424, 'M', u'𐑌'),
+ (0x10425, 'M', u'𐑍'),
+ (0x10426, 'M', u'𐑎'),
+ (0x10427, 'M', u'𐑏'),
+ (0x10428, 'V'),
+ (0x1049E, 'X'),
+ (0x104A0, 'V'),
+ (0x104AA, 'X'),
+ (0x104B0, 'M', u'𐓘'),
+ (0x104B1, 'M', u'𐓙'),
+ (0x104B2, 'M', u'𐓚'),
+ (0x104B3, 'M', u'𐓛'),
+ (0x104B4, 'M', u'𐓜'),
+ (0x104B5, 'M', u'𐓝'),
+ (0x104B6, 'M', u'𐓞'),
+ (0x104B7, 'M', u'𐓟'),
+ (0x104B8, 'M', u'𐓠'),
+ (0x104B9, 'M', u'𐓡'),
+ (0x104BA, 'M', u'𐓢'),
+ (0x104BB, 'M', u'𐓣'),
+ (0x104BC, 'M', u'𐓤'),
+ (0x104BD, 'M', u'𐓥'),
+ (0x104BE, 'M', u'𐓦'),
+ ]
+
+def _seg_54():
+ return [
+ (0x104BF, 'M', u'𐓧'),
+ (0x104C0, 'M', u'𐓨'),
+ (0x104C1, 'M', u'𐓩'),
+ (0x104C2, 'M', u'𐓪'),
+ (0x104C3, 'M', u'𐓫'),
+ (0x104C4, 'M', u'𐓬'),
+ (0x104C5, 'M', u'𐓭'),
+ (0x104C6, 'M', u'𐓮'),
+ (0x104C7, 'M', u'𐓯'),
+ (0x104C8, 'M', u'𐓰'),
+ (0x104C9, 'M', u'𐓱'),
+ (0x104CA, 'M', u'𐓲'),
+ (0x104CB, 'M', u'𐓳'),
+ (0x104CC, 'M', u'𐓴'),
+ (0x104CD, 'M', u'𐓵'),
+ (0x104CE, 'M', u'𐓶'),
+ (0x104CF, 'M', u'𐓷'),
+ (0x104D0, 'M', u'𐓸'),
+ (0x104D1, 'M', u'𐓹'),
+ (0x104D2, 'M', u'𐓺'),
+ (0x104D3, 'M', u'𐓻'),
+ (0x104D4, 'X'),
+ (0x104D8, 'V'),
+ (0x104FC, 'X'),
+ (0x10500, 'V'),
+ (0x10528, 'X'),
+ (0x10530, 'V'),
+ (0x10564, 'X'),
+ (0x1056F, 'V'),
+ (0x10570, 'X'),
+ (0x10600, 'V'),
+ (0x10737, 'X'),
+ (0x10740, 'V'),
+ (0x10756, 'X'),
+ (0x10760, 'V'),
+ (0x10768, 'X'),
+ (0x10800, 'V'),
+ (0x10806, 'X'),
+ (0x10808, 'V'),
+ (0x10809, 'X'),
+ (0x1080A, 'V'),
+ (0x10836, 'X'),
+ (0x10837, 'V'),
+ (0x10839, 'X'),
+ (0x1083C, 'V'),
+ (0x1083D, 'X'),
+ (0x1083F, 'V'),
+ (0x10856, 'X'),
+ (0x10857, 'V'),
+ (0x1089F, 'X'),
+ (0x108A7, 'V'),
+ (0x108B0, 'X'),
+ (0x108E0, 'V'),
+ (0x108F3, 'X'),
+ (0x108F4, 'V'),
+ (0x108F6, 'X'),
+ (0x108FB, 'V'),
+ (0x1091C, 'X'),
+ (0x1091F, 'V'),
+ (0x1093A, 'X'),
+ (0x1093F, 'V'),
+ (0x10940, 'X'),
+ (0x10980, 'V'),
+ (0x109B8, 'X'),
+ (0x109BC, 'V'),
+ (0x109D0, 'X'),
+ (0x109D2, 'V'),
+ (0x10A04, 'X'),
+ (0x10A05, 'V'),
+ (0x10A07, 'X'),
+ (0x10A0C, 'V'),
+ (0x10A14, 'X'),
+ (0x10A15, 'V'),
+ (0x10A18, 'X'),
+ (0x10A19, 'V'),
+ (0x10A36, 'X'),
+ (0x10A38, 'V'),
+ (0x10A3B, 'X'),
+ (0x10A3F, 'V'),
+ (0x10A49, 'X'),
+ (0x10A50, 'V'),
+ (0x10A59, 'X'),
+ (0x10A60, 'V'),
+ (0x10AA0, 'X'),
+ (0x10AC0, 'V'),
+ (0x10AE7, 'X'),
+ (0x10AEB, 'V'),
+ (0x10AF7, 'X'),
+ (0x10B00, 'V'),
+ (0x10B36, 'X'),
+ (0x10B39, 'V'),
+ (0x10B56, 'X'),
+ (0x10B58, 'V'),
+ (0x10B73, 'X'),
+ (0x10B78, 'V'),
+ (0x10B92, 'X'),
+ (0x10B99, 'V'),
+ (0x10B9D, 'X'),
+ (0x10BA9, 'V'),
+ (0x10BB0, 'X'),
+ ]
+
+def _seg_55():
+ return [
+ (0x10C00, 'V'),
+ (0x10C49, 'X'),
+ (0x10C80, 'M', u'𐳀'),
+ (0x10C81, 'M', u'𐳁'),
+ (0x10C82, 'M', u'𐳂'),
+ (0x10C83, 'M', u'𐳃'),
+ (0x10C84, 'M', u'𐳄'),
+ (0x10C85, 'M', u'𐳅'),
+ (0x10C86, 'M', u'𐳆'),
+ (0x10C87, 'M', u'𐳇'),
+ (0x10C88, 'M', u'𐳈'),
+ (0x10C89, 'M', u'𐳉'),
+ (0x10C8A, 'M', u'𐳊'),
+ (0x10C8B, 'M', u'𐳋'),
+ (0x10C8C, 'M', u'𐳌'),
+ (0x10C8D, 'M', u'𐳍'),
+ (0x10C8E, 'M', u'𐳎'),
+ (0x10C8F, 'M', u'𐳏'),
+ (0x10C90, 'M', u'𐳐'),
+ (0x10C91, 'M', u'𐳑'),
+ (0x10C92, 'M', u'𐳒'),
+ (0x10C93, 'M', u'𐳓'),
+ (0x10C94, 'M', u'𐳔'),
+ (0x10C95, 'M', u'𐳕'),
+ (0x10C96, 'M', u'𐳖'),
+ (0x10C97, 'M', u'𐳗'),
+ (0x10C98, 'M', u'𐳘'),
+ (0x10C99, 'M', u'𐳙'),
+ (0x10C9A, 'M', u'𐳚'),
+ (0x10C9B, 'M', u'𐳛'),
+ (0x10C9C, 'M', u'𐳜'),
+ (0x10C9D, 'M', u'𐳝'),
+ (0x10C9E, 'M', u'𐳞'),
+ (0x10C9F, 'M', u'𐳟'),
+ (0x10CA0, 'M', u'𐳠'),
+ (0x10CA1, 'M', u'𐳡'),
+ (0x10CA2, 'M', u'𐳢'),
+ (0x10CA3, 'M', u'𐳣'),
+ (0x10CA4, 'M', u'𐳤'),
+ (0x10CA5, 'M', u'𐳥'),
+ (0x10CA6, 'M', u'𐳦'),
+ (0x10CA7, 'M', u'𐳧'),
+ (0x10CA8, 'M', u'𐳨'),
+ (0x10CA9, 'M', u'𐳩'),
+ (0x10CAA, 'M', u'𐳪'),
+ (0x10CAB, 'M', u'𐳫'),
+ (0x10CAC, 'M', u'𐳬'),
+ (0x10CAD, 'M', u'𐳭'),
+ (0x10CAE, 'M', u'𐳮'),
+ (0x10CAF, 'M', u'𐳯'),
+ (0x10CB0, 'M', u'𐳰'),
+ (0x10CB1, 'M', u'𐳱'),
+ (0x10CB2, 'M', u'𐳲'),
+ (0x10CB3, 'X'),
+ (0x10CC0, 'V'),
+ (0x10CF3, 'X'),
+ (0x10CFA, 'V'),
+ (0x10D28, 'X'),
+ (0x10D30, 'V'),
+ (0x10D3A, 'X'),
+ (0x10E60, 'V'),
+ (0x10E7F, 'X'),
+ (0x10E80, 'V'),
+ (0x10EAA, 'X'),
+ (0x10EAB, 'V'),
+ (0x10EAE, 'X'),
+ (0x10EB0, 'V'),
+ (0x10EB2, 'X'),
+ (0x10F00, 'V'),
+ (0x10F28, 'X'),
+ (0x10F30, 'V'),
+ (0x10F5A, 'X'),
+ (0x10FB0, 'V'),
+ (0x10FCC, 'X'),
+ (0x10FE0, 'V'),
+ (0x10FF7, 'X'),
+ (0x11000, 'V'),
+ (0x1104E, 'X'),
+ (0x11052, 'V'),
+ (0x11070, 'X'),
+ (0x1107F, 'V'),
+ (0x110BD, 'X'),
+ (0x110BE, 'V'),
+ (0x110C2, 'X'),
+ (0x110D0, 'V'),
+ (0x110E9, 'X'),
+ (0x110F0, 'V'),
+ (0x110FA, 'X'),
+ (0x11100, 'V'),
+ (0x11135, 'X'),
+ (0x11136, 'V'),
+ (0x11148, 'X'),
+ (0x11150, 'V'),
+ (0x11177, 'X'),
+ (0x11180, 'V'),
+ (0x111E0, 'X'),
+ (0x111E1, 'V'),
+ (0x111F5, 'X'),
+ (0x11200, 'V'),
+ (0x11212, 'X'),
+ ]
+
+def _seg_56():
+ return [
+ (0x11213, 'V'),
+ (0x1123F, 'X'),
+ (0x11280, 'V'),
+ (0x11287, 'X'),
+ (0x11288, 'V'),
+ (0x11289, 'X'),
+ (0x1128A, 'V'),
+ (0x1128E, 'X'),
+ (0x1128F, 'V'),
+ (0x1129E, 'X'),
+ (0x1129F, 'V'),
+ (0x112AA, 'X'),
+ (0x112B0, 'V'),
+ (0x112EB, 'X'),
+ (0x112F0, 'V'),
+ (0x112FA, 'X'),
+ (0x11300, 'V'),
+ (0x11304, 'X'),
+ (0x11305, 'V'),
+ (0x1130D, 'X'),
+ (0x1130F, 'V'),
+ (0x11311, 'X'),
+ (0x11313, 'V'),
+ (0x11329, 'X'),
+ (0x1132A, 'V'),
+ (0x11331, 'X'),
+ (0x11332, 'V'),
+ (0x11334, 'X'),
+ (0x11335, 'V'),
+ (0x1133A, 'X'),
+ (0x1133B, 'V'),
+ (0x11345, 'X'),
+ (0x11347, 'V'),
+ (0x11349, 'X'),
+ (0x1134B, 'V'),
+ (0x1134E, 'X'),
+ (0x11350, 'V'),
+ (0x11351, 'X'),
+ (0x11357, 'V'),
+ (0x11358, 'X'),
+ (0x1135D, 'V'),
+ (0x11364, 'X'),
+ (0x11366, 'V'),
+ (0x1136D, 'X'),
+ (0x11370, 'V'),
+ (0x11375, 'X'),
+ (0x11400, 'V'),
+ (0x1145C, 'X'),
+ (0x1145D, 'V'),
+ (0x11462, 'X'),
+ (0x11480, 'V'),
+ (0x114C8, 'X'),
+ (0x114D0, 'V'),
+ (0x114DA, 'X'),
+ (0x11580, 'V'),
+ (0x115B6, 'X'),
+ (0x115B8, 'V'),
+ (0x115DE, 'X'),
+ (0x11600, 'V'),
+ (0x11645, 'X'),
+ (0x11650, 'V'),
+ (0x1165A, 'X'),
+ (0x11660, 'V'),
+ (0x1166D, 'X'),
+ (0x11680, 'V'),
+ (0x116B9, 'X'),
+ (0x116C0, 'V'),
+ (0x116CA, 'X'),
+ (0x11700, 'V'),
+ (0x1171B, 'X'),
+ (0x1171D, 'V'),
+ (0x1172C, 'X'),
+ (0x11730, 'V'),
+ (0x11740, 'X'),
+ (0x11800, 'V'),
+ (0x1183C, 'X'),
+ (0x118A0, 'M', u'𑣀'),
+ (0x118A1, 'M', u'𑣁'),
+ (0x118A2, 'M', u'𑣂'),
+ (0x118A3, 'M', u'𑣃'),
+ (0x118A4, 'M', u'𑣄'),
+ (0x118A5, 'M', u'𑣅'),
+ (0x118A6, 'M', u'𑣆'),
+ (0x118A7, 'M', u'𑣇'),
+ (0x118A8, 'M', u'𑣈'),
+ (0x118A9, 'M', u'𑣉'),
+ (0x118AA, 'M', u'𑣊'),
+ (0x118AB, 'M', u'𑣋'),
+ (0x118AC, 'M', u'𑣌'),
+ (0x118AD, 'M', u'𑣍'),
+ (0x118AE, 'M', u'𑣎'),
+ (0x118AF, 'M', u'𑣏'),
+ (0x118B0, 'M', u'𑣐'),
+ (0x118B1, 'M', u'𑣑'),
+ (0x118B2, 'M', u'𑣒'),
+ (0x118B3, 'M', u'𑣓'),
+ (0x118B4, 'M', u'𑣔'),
+ (0x118B5, 'M', u'𑣕'),
+ (0x118B6, 'M', u'𑣖'),
+ (0x118B7, 'M', u'𑣗'),
+ ]
+
+def _seg_57():
+ return [
+ (0x118B8, 'M', u'𑣘'),
+ (0x118B9, 'M', u'𑣙'),
+ (0x118BA, 'M', u'𑣚'),
+ (0x118BB, 'M', u'𑣛'),
+ (0x118BC, 'M', u'𑣜'),
+ (0x118BD, 'M', u'𑣝'),
+ (0x118BE, 'M', u'𑣞'),
+ (0x118BF, 'M', u'𑣟'),
+ (0x118C0, 'V'),
+ (0x118F3, 'X'),
+ (0x118FF, 'V'),
+ (0x11907, 'X'),
+ (0x11909, 'V'),
+ (0x1190A, 'X'),
+ (0x1190C, 'V'),
+ (0x11914, 'X'),
+ (0x11915, 'V'),
+ (0x11917, 'X'),
+ (0x11918, 'V'),
+ (0x11936, 'X'),
+ (0x11937, 'V'),
+ (0x11939, 'X'),
+ (0x1193B, 'V'),
+ (0x11947, 'X'),
+ (0x11950, 'V'),
+ (0x1195A, 'X'),
+ (0x119A0, 'V'),
+ (0x119A8, 'X'),
+ (0x119AA, 'V'),
+ (0x119D8, 'X'),
+ (0x119DA, 'V'),
+ (0x119E5, 'X'),
+ (0x11A00, 'V'),
+ (0x11A48, 'X'),
+ (0x11A50, 'V'),
+ (0x11AA3, 'X'),
+ (0x11AC0, 'V'),
+ (0x11AF9, 'X'),
+ (0x11C00, 'V'),
+ (0x11C09, 'X'),
+ (0x11C0A, 'V'),
+ (0x11C37, 'X'),
+ (0x11C38, 'V'),
+ (0x11C46, 'X'),
+ (0x11C50, 'V'),
+ (0x11C6D, 'X'),
+ (0x11C70, 'V'),
+ (0x11C90, 'X'),
+ (0x11C92, 'V'),
+ (0x11CA8, 'X'),
+ (0x11CA9, 'V'),
+ (0x11CB7, 'X'),
+ (0x11D00, 'V'),
+ (0x11D07, 'X'),
+ (0x11D08, 'V'),
+ (0x11D0A, 'X'),
+ (0x11D0B, 'V'),
+ (0x11D37, 'X'),
+ (0x11D3A, 'V'),
+ (0x11D3B, 'X'),
+ (0x11D3C, 'V'),
+ (0x11D3E, 'X'),
+ (0x11D3F, 'V'),
+ (0x11D48, 'X'),
+ (0x11D50, 'V'),
+ (0x11D5A, 'X'),
+ (0x11D60, 'V'),
+ (0x11D66, 'X'),
+ (0x11D67, 'V'),
+ (0x11D69, 'X'),
+ (0x11D6A, 'V'),
+ (0x11D8F, 'X'),
+ (0x11D90, 'V'),
+ (0x11D92, 'X'),
+ (0x11D93, 'V'),
+ (0x11D99, 'X'),
+ (0x11DA0, 'V'),
+ (0x11DAA, 'X'),
+ (0x11EE0, 'V'),
+ (0x11EF9, 'X'),
+ (0x11FB0, 'V'),
+ (0x11FB1, 'X'),
+ (0x11FC0, 'V'),
+ (0x11FF2, 'X'),
+ (0x11FFF, 'V'),
+ (0x1239A, 'X'),
+ (0x12400, 'V'),
+ (0x1246F, 'X'),
+ (0x12470, 'V'),
+ (0x12475, 'X'),
+ (0x12480, 'V'),
+ (0x12544, 'X'),
+ (0x13000, 'V'),
+ (0x1342F, 'X'),
+ (0x14400, 'V'),
+ (0x14647, 'X'),
+ (0x16800, 'V'),
+ (0x16A39, 'X'),
+ (0x16A40, 'V'),
+ (0x16A5F, 'X'),
+ ]
+
+def _seg_58():
+ return [
+ (0x16A60, 'V'),
+ (0x16A6A, 'X'),
+ (0x16A6E, 'V'),
+ (0x16A70, 'X'),
+ (0x16AD0, 'V'),
+ (0x16AEE, 'X'),
+ (0x16AF0, 'V'),
+ (0x16AF6, 'X'),
+ (0x16B00, 'V'),
+ (0x16B46, 'X'),
+ (0x16B50, 'V'),
+ (0x16B5A, 'X'),
+ (0x16B5B, 'V'),
+ (0x16B62, 'X'),
+ (0x16B63, 'V'),
+ (0x16B78, 'X'),
+ (0x16B7D, 'V'),
+ (0x16B90, 'X'),
+ (0x16E40, 'M', u'𖹠'),
+ (0x16E41, 'M', u'𖹡'),
+ (0x16E42, 'M', u'𖹢'),
+ (0x16E43, 'M', u'𖹣'),
+ (0x16E44, 'M', u'𖹤'),
+ (0x16E45, 'M', u'𖹥'),
+ (0x16E46, 'M', u'𖹦'),
+ (0x16E47, 'M', u'𖹧'),
+ (0x16E48, 'M', u'𖹨'),
+ (0x16E49, 'M', u'𖹩'),
+ (0x16E4A, 'M', u'𖹪'),
+ (0x16E4B, 'M', u'𖹫'),
+ (0x16E4C, 'M', u'𖹬'),
+ (0x16E4D, 'M', u'𖹭'),
+ (0x16E4E, 'M', u'𖹮'),
+ (0x16E4F, 'M', u'𖹯'),
+ (0x16E50, 'M', u'𖹰'),
+ (0x16E51, 'M', u'𖹱'),
+ (0x16E52, 'M', u'𖹲'),
+ (0x16E53, 'M', u'𖹳'),
+ (0x16E54, 'M', u'𖹴'),
+ (0x16E55, 'M', u'𖹵'),
+ (0x16E56, 'M', u'𖹶'),
+ (0x16E57, 'M', u'𖹷'),
+ (0x16E58, 'M', u'𖹸'),
+ (0x16E59, 'M', u'𖹹'),
+ (0x16E5A, 'M', u'𖹺'),
+ (0x16E5B, 'M', u'𖹻'),
+ (0x16E5C, 'M', u'𖹼'),
+ (0x16E5D, 'M', u'𖹽'),
+ (0x16E5E, 'M', u'𖹾'),
+ (0x16E5F, 'M', u'𖹿'),
+ (0x16E60, 'V'),
+ (0x16E9B, 'X'),
+ (0x16F00, 'V'),
+ (0x16F4B, 'X'),
+ (0x16F4F, 'V'),
+ (0x16F88, 'X'),
+ (0x16F8F, 'V'),
+ (0x16FA0, 'X'),
+ (0x16FE0, 'V'),
+ (0x16FE5, 'X'),
+ (0x16FF0, 'V'),
+ (0x16FF2, 'X'),
+ (0x17000, 'V'),
+ (0x187F8, 'X'),
+ (0x18800, 'V'),
+ (0x18CD6, 'X'),
+ (0x18D00, 'V'),
+ (0x18D09, 'X'),
+ (0x1B000, 'V'),
+ (0x1B11F, 'X'),
+ (0x1B150, 'V'),
+ (0x1B153, 'X'),
+ (0x1B164, 'V'),
+ (0x1B168, 'X'),
+ (0x1B170, 'V'),
+ (0x1B2FC, 'X'),
+ (0x1BC00, 'V'),
+ (0x1BC6B, 'X'),
+ (0x1BC70, 'V'),
+ (0x1BC7D, 'X'),
+ (0x1BC80, 'V'),
+ (0x1BC89, 'X'),
+ (0x1BC90, 'V'),
+ (0x1BC9A, 'X'),
+ (0x1BC9C, 'V'),
+ (0x1BCA0, 'I'),
+ (0x1BCA4, 'X'),
+ (0x1D000, 'V'),
+ (0x1D0F6, 'X'),
+ (0x1D100, 'V'),
+ (0x1D127, 'X'),
+ (0x1D129, 'V'),
+ (0x1D15E, 'M', u'𝅗𝅥'),
+ (0x1D15F, 'M', u'𝅘𝅥'),
+ (0x1D160, 'M', u'𝅘𝅥𝅮'),
+ (0x1D161, 'M', u'𝅘𝅥𝅯'),
+ (0x1D162, 'M', u'𝅘𝅥𝅰'),
+ (0x1D163, 'M', u'𝅘𝅥𝅱'),
+ (0x1D164, 'M', u'𝅘𝅥𝅲'),
+ (0x1D165, 'V'),
+ ]
+
+def _seg_59():
+ return [
+ (0x1D173, 'X'),
+ (0x1D17B, 'V'),
+ (0x1D1BB, 'M', u'𝆹𝅥'),
+ (0x1D1BC, 'M', u'𝆺𝅥'),
+ (0x1D1BD, 'M', u'𝆹𝅥𝅮'),
+ (0x1D1BE, 'M', u'𝆺𝅥𝅮'),
+ (0x1D1BF, 'M', u'𝆹𝅥𝅯'),
+ (0x1D1C0, 'M', u'𝆺𝅥𝅯'),
+ (0x1D1C1, 'V'),
+ (0x1D1E9, 'X'),
+ (0x1D200, 'V'),
+ (0x1D246, 'X'),
+ (0x1D2E0, 'V'),
+ (0x1D2F4, 'X'),
+ (0x1D300, 'V'),
+ (0x1D357, 'X'),
+ (0x1D360, 'V'),
+ (0x1D379, 'X'),
+ (0x1D400, 'M', u'a'),
+ (0x1D401, 'M', u'b'),
+ (0x1D402, 'M', u'c'),
+ (0x1D403, 'M', u'd'),
+ (0x1D404, 'M', u'e'),
+ (0x1D405, 'M', u'f'),
+ (0x1D406, 'M', u'g'),
+ (0x1D407, 'M', u'h'),
+ (0x1D408, 'M', u'i'),
+ (0x1D409, 'M', u'j'),
+ (0x1D40A, 'M', u'k'),
+ (0x1D40B, 'M', u'l'),
+ (0x1D40C, 'M', u'm'),
+ (0x1D40D, 'M', u'n'),
+ (0x1D40E, 'M', u'o'),
+ (0x1D40F, 'M', u'p'),
+ (0x1D410, 'M', u'q'),
+ (0x1D411, 'M', u'r'),
+ (0x1D412, 'M', u's'),
+ (0x1D413, 'M', u't'),
+ (0x1D414, 'M', u'u'),
+ (0x1D415, 'M', u'v'),
+ (0x1D416, 'M', u'w'),
+ (0x1D417, 'M', u'x'),
+ (0x1D418, 'M', u'y'),
+ (0x1D419, 'M', u'z'),
+ (0x1D41A, 'M', u'a'),
+ (0x1D41B, 'M', u'b'),
+ (0x1D41C, 'M', u'c'),
+ (0x1D41D, 'M', u'd'),
+ (0x1D41E, 'M', u'e'),
+ (0x1D41F, 'M', u'f'),
+ (0x1D420, 'M', u'g'),
+ (0x1D421, 'M', u'h'),
+ (0x1D422, 'M', u'i'),
+ (0x1D423, 'M', u'j'),
+ (0x1D424, 'M', u'k'),
+ (0x1D425, 'M', u'l'),
+ (0x1D426, 'M', u'm'),
+ (0x1D427, 'M', u'n'),
+ (0x1D428, 'M', u'o'),
+ (0x1D429, 'M', u'p'),
+ (0x1D42A, 'M', u'q'),
+ (0x1D42B, 'M', u'r'),
+ (0x1D42C, 'M', u's'),
+ (0x1D42D, 'M', u't'),
+ (0x1D42E, 'M', u'u'),
+ (0x1D42F, 'M', u'v'),
+ (0x1D430, 'M', u'w'),
+ (0x1D431, 'M', u'x'),
+ (0x1D432, 'M', u'y'),
+ (0x1D433, 'M', u'z'),
+ (0x1D434, 'M', u'a'),
+ (0x1D435, 'M', u'b'),
+ (0x1D436, 'M', u'c'),
+ (0x1D437, 'M', u'd'),
+ (0x1D438, 'M', u'e'),
+ (0x1D439, 'M', u'f'),
+ (0x1D43A, 'M', u'g'),
+ (0x1D43B, 'M', u'h'),
+ (0x1D43C, 'M', u'i'),
+ (0x1D43D, 'M', u'j'),
+ (0x1D43E, 'M', u'k'),
+ (0x1D43F, 'M', u'l'),
+ (0x1D440, 'M', u'm'),
+ (0x1D441, 'M', u'n'),
+ (0x1D442, 'M', u'o'),
+ (0x1D443, 'M', u'p'),
+ (0x1D444, 'M', u'q'),
+ (0x1D445, 'M', u'r'),
+ (0x1D446, 'M', u's'),
+ (0x1D447, 'M', u't'),
+ (0x1D448, 'M', u'u'),
+ (0x1D449, 'M', u'v'),
+ (0x1D44A, 'M', u'w'),
+ (0x1D44B, 'M', u'x'),
+ (0x1D44C, 'M', u'y'),
+ (0x1D44D, 'M', u'z'),
+ (0x1D44E, 'M', u'a'),
+ (0x1D44F, 'M', u'b'),
+ (0x1D450, 'M', u'c'),
+ (0x1D451, 'M', u'd'),
+ ]
+
+def _seg_60():
+ return [
+ (0x1D452, 'M', u'e'),
+ (0x1D453, 'M', u'f'),
+ (0x1D454, 'M', u'g'),
+ (0x1D455, 'X'),
+ (0x1D456, 'M', u'i'),
+ (0x1D457, 'M', u'j'),
+ (0x1D458, 'M', u'k'),
+ (0x1D459, 'M', u'l'),
+ (0x1D45A, 'M', u'm'),
+ (0x1D45B, 'M', u'n'),
+ (0x1D45C, 'M', u'o'),
+ (0x1D45D, 'M', u'p'),
+ (0x1D45E, 'M', u'q'),
+ (0x1D45F, 'M', u'r'),
+ (0x1D460, 'M', u's'),
+ (0x1D461, 'M', u't'),
+ (0x1D462, 'M', u'u'),
+ (0x1D463, 'M', u'v'),
+ (0x1D464, 'M', u'w'),
+ (0x1D465, 'M', u'x'),
+ (0x1D466, 'M', u'y'),
+ (0x1D467, 'M', u'z'),
+ (0x1D468, 'M', u'a'),
+ (0x1D469, 'M', u'b'),
+ (0x1D46A, 'M', u'c'),
+ (0x1D46B, 'M', u'd'),
+ (0x1D46C, 'M', u'e'),
+ (0x1D46D, 'M', u'f'),
+ (0x1D46E, 'M', u'g'),
+ (0x1D46F, 'M', u'h'),
+ (0x1D470, 'M', u'i'),
+ (0x1D471, 'M', u'j'),
+ (0x1D472, 'M', u'k'),
+ (0x1D473, 'M', u'l'),
+ (0x1D474, 'M', u'm'),
+ (0x1D475, 'M', u'n'),
+ (0x1D476, 'M', u'o'),
+ (0x1D477, 'M', u'p'),
+ (0x1D478, 'M', u'q'),
+ (0x1D479, 'M', u'r'),
+ (0x1D47A, 'M', u's'),
+ (0x1D47B, 'M', u't'),
+ (0x1D47C, 'M', u'u'),
+ (0x1D47D, 'M', u'v'),
+ (0x1D47E, 'M', u'w'),
+ (0x1D47F, 'M', u'x'),
+ (0x1D480, 'M', u'y'),
+ (0x1D481, 'M', u'z'),
+ (0x1D482, 'M', u'a'),
+ (0x1D483, 'M', u'b'),
+ (0x1D484, 'M', u'c'),
+ (0x1D485, 'M', u'd'),
+ (0x1D486, 'M', u'e'),
+ (0x1D487, 'M', u'f'),
+ (0x1D488, 'M', u'g'),
+ (0x1D489, 'M', u'h'),
+ (0x1D48A, 'M', u'i'),
+ (0x1D48B, 'M', u'j'),
+ (0x1D48C, 'M', u'k'),
+ (0x1D48D, 'M', u'l'),
+ (0x1D48E, 'M', u'm'),
+ (0x1D48F, 'M', u'n'),
+ (0x1D490, 'M', u'o'),
+ (0x1D491, 'M', u'p'),
+ (0x1D492, 'M', u'q'),
+ (0x1D493, 'M', u'r'),
+ (0x1D494, 'M', u's'),
+ (0x1D495, 'M', u't'),
+ (0x1D496, 'M', u'u'),
+ (0x1D497, 'M', u'v'),
+ (0x1D498, 'M', u'w'),
+ (0x1D499, 'M', u'x'),
+ (0x1D49A, 'M', u'y'),
+ (0x1D49B, 'M', u'z'),
+ (0x1D49C, 'M', u'a'),
+ (0x1D49D, 'X'),
+ (0x1D49E, 'M', u'c'),
+ (0x1D49F, 'M', u'd'),
+ (0x1D4A0, 'X'),
+ (0x1D4A2, 'M', u'g'),
+ (0x1D4A3, 'X'),
+ (0x1D4A5, 'M', u'j'),
+ (0x1D4A6, 'M', u'k'),
+ (0x1D4A7, 'X'),
+ (0x1D4A9, 'M', u'n'),
+ (0x1D4AA, 'M', u'o'),
+ (0x1D4AB, 'M', u'p'),
+ (0x1D4AC, 'M', u'q'),
+ (0x1D4AD, 'X'),
+ (0x1D4AE, 'M', u's'),
+ (0x1D4AF, 'M', u't'),
+ (0x1D4B0, 'M', u'u'),
+ (0x1D4B1, 'M', u'v'),
+ (0x1D4B2, 'M', u'w'),
+ (0x1D4B3, 'M', u'x'),
+ (0x1D4B4, 'M', u'y'),
+ (0x1D4B5, 'M', u'z'),
+ (0x1D4B6, 'M', u'a'),
+ (0x1D4B7, 'M', u'b'),
+ (0x1D4B8, 'M', u'c'),
+ ]
+
+def _seg_61():
+ return [
+ (0x1D4B9, 'M', u'd'),
+ (0x1D4BA, 'X'),
+ (0x1D4BB, 'M', u'f'),
+ (0x1D4BC, 'X'),
+ (0x1D4BD, 'M', u'h'),
+ (0x1D4BE, 'M', u'i'),
+ (0x1D4BF, 'M', u'j'),
+ (0x1D4C0, 'M', u'k'),
+ (0x1D4C1, 'M', u'l'),
+ (0x1D4C2, 'M', u'm'),
+ (0x1D4C3, 'M', u'n'),
+ (0x1D4C4, 'X'),
+ (0x1D4C5, 'M', u'p'),
+ (0x1D4C6, 'M', u'q'),
+ (0x1D4C7, 'M', u'r'),
+ (0x1D4C8, 'M', u's'),
+ (0x1D4C9, 'M', u't'),
+ (0x1D4CA, 'M', u'u'),
+ (0x1D4CB, 'M', u'v'),
+ (0x1D4CC, 'M', u'w'),
+ (0x1D4CD, 'M', u'x'),
+ (0x1D4CE, 'M', u'y'),
+ (0x1D4CF, 'M', u'z'),
+ (0x1D4D0, 'M', u'a'),
+ (0x1D4D1, 'M', u'b'),
+ (0x1D4D2, 'M', u'c'),
+ (0x1D4D3, 'M', u'd'),
+ (0x1D4D4, 'M', u'e'),
+ (0x1D4D5, 'M', u'f'),
+ (0x1D4D6, 'M', u'g'),
+ (0x1D4D7, 'M', u'h'),
+ (0x1D4D8, 'M', u'i'),
+ (0x1D4D9, 'M', u'j'),
+ (0x1D4DA, 'M', u'k'),
+ (0x1D4DB, 'M', u'l'),
+ (0x1D4DC, 'M', u'm'),
+ (0x1D4DD, 'M', u'n'),
+ (0x1D4DE, 'M', u'o'),
+ (0x1D4DF, 'M', u'p'),
+ (0x1D4E0, 'M', u'q'),
+ (0x1D4E1, 'M', u'r'),
+ (0x1D4E2, 'M', u's'),
+ (0x1D4E3, 'M', u't'),
+ (0x1D4E4, 'M', u'u'),
+ (0x1D4E5, 'M', u'v'),
+ (0x1D4E6, 'M', u'w'),
+ (0x1D4E7, 'M', u'x'),
+ (0x1D4E8, 'M', u'y'),
+ (0x1D4E9, 'M', u'z'),
+ (0x1D4EA, 'M', u'a'),
+ (0x1D4EB, 'M', u'b'),
+ (0x1D4EC, 'M', u'c'),
+ (0x1D4ED, 'M', u'd'),
+ (0x1D4EE, 'M', u'e'),
+ (0x1D4EF, 'M', u'f'),
+ (0x1D4F0, 'M', u'g'),
+ (0x1D4F1, 'M', u'h'),
+ (0x1D4F2, 'M', u'i'),
+ (0x1D4F3, 'M', u'j'),
+ (0x1D4F4, 'M', u'k'),
+ (0x1D4F5, 'M', u'l'),
+ (0x1D4F6, 'M', u'm'),
+ (0x1D4F7, 'M', u'n'),
+ (0x1D4F8, 'M', u'o'),
+ (0x1D4F9, 'M', u'p'),
+ (0x1D4FA, 'M', u'q'),
+ (0x1D4FB, 'M', u'r'),
+ (0x1D4FC, 'M', u's'),
+ (0x1D4FD, 'M', u't'),
+ (0x1D4FE, 'M', u'u'),
+ (0x1D4FF, 'M', u'v'),
+ (0x1D500, 'M', u'w'),
+ (0x1D501, 'M', u'x'),
+ (0x1D502, 'M', u'y'),
+ (0x1D503, 'M', u'z'),
+ (0x1D504, 'M', u'a'),
+ (0x1D505, 'M', u'b'),
+ (0x1D506, 'X'),
+ (0x1D507, 'M', u'd'),
+ (0x1D508, 'M', u'e'),
+ (0x1D509, 'M', u'f'),
+ (0x1D50A, 'M', u'g'),
+ (0x1D50B, 'X'),
+ (0x1D50D, 'M', u'j'),
+ (0x1D50E, 'M', u'k'),
+ (0x1D50F, 'M', u'l'),
+ (0x1D510, 'M', u'm'),
+ (0x1D511, 'M', u'n'),
+ (0x1D512, 'M', u'o'),
+ (0x1D513, 'M', u'p'),
+ (0x1D514, 'M', u'q'),
+ (0x1D515, 'X'),
+ (0x1D516, 'M', u's'),
+ (0x1D517, 'M', u't'),
+ (0x1D518, 'M', u'u'),
+ (0x1D519, 'M', u'v'),
+ (0x1D51A, 'M', u'w'),
+ (0x1D51B, 'M', u'x'),
+ (0x1D51C, 'M', u'y'),
+ (0x1D51D, 'X'),
+ ]
+
+def _seg_62():
+ return [
+ (0x1D51E, 'M', u'a'),
+ (0x1D51F, 'M', u'b'),
+ (0x1D520, 'M', u'c'),
+ (0x1D521, 'M', u'd'),
+ (0x1D522, 'M', u'e'),
+ (0x1D523, 'M', u'f'),
+ (0x1D524, 'M', u'g'),
+ (0x1D525, 'M', u'h'),
+ (0x1D526, 'M', u'i'),
+ (0x1D527, 'M', u'j'),
+ (0x1D528, 'M', u'k'),
+ (0x1D529, 'M', u'l'),
+ (0x1D52A, 'M', u'm'),
+ (0x1D52B, 'M', u'n'),
+ (0x1D52C, 'M', u'o'),
+ (0x1D52D, 'M', u'p'),
+ (0x1D52E, 'M', u'q'),
+ (0x1D52F, 'M', u'r'),
+ (0x1D530, 'M', u's'),
+ (0x1D531, 'M', u't'),
+ (0x1D532, 'M', u'u'),
+ (0x1D533, 'M', u'v'),
+ (0x1D534, 'M', u'w'),
+ (0x1D535, 'M', u'x'),
+ (0x1D536, 'M', u'y'),
+ (0x1D537, 'M', u'z'),
+ (0x1D538, 'M', u'a'),
+ (0x1D539, 'M', u'b'),
+ (0x1D53A, 'X'),
+ (0x1D53B, 'M', u'd'),
+ (0x1D53C, 'M', u'e'),
+ (0x1D53D, 'M', u'f'),
+ (0x1D53E, 'M', u'g'),
+ (0x1D53F, 'X'),
+ (0x1D540, 'M', u'i'),
+ (0x1D541, 'M', u'j'),
+ (0x1D542, 'M', u'k'),
+ (0x1D543, 'M', u'l'),
+ (0x1D544, 'M', u'm'),
+ (0x1D545, 'X'),
+ (0x1D546, 'M', u'o'),
+ (0x1D547, 'X'),
+ (0x1D54A, 'M', u's'),
+ (0x1D54B, 'M', u't'),
+ (0x1D54C, 'M', u'u'),
+ (0x1D54D, 'M', u'v'),
+ (0x1D54E, 'M', u'w'),
+ (0x1D54F, 'M', u'x'),
+ (0x1D550, 'M', u'y'),
+ (0x1D551, 'X'),
+ (0x1D552, 'M', u'a'),
+ (0x1D553, 'M', u'b'),
+ (0x1D554, 'M', u'c'),
+ (0x1D555, 'M', u'd'),
+ (0x1D556, 'M', u'e'),
+ (0x1D557, 'M', u'f'),
+ (0x1D558, 'M', u'g'),
+ (0x1D559, 'M', u'h'),
+ (0x1D55A, 'M', u'i'),
+ (0x1D55B, 'M', u'j'),
+ (0x1D55C, 'M', u'k'),
+ (0x1D55D, 'M', u'l'),
+ (0x1D55E, 'M', u'm'),
+ (0x1D55F, 'M', u'n'),
+ (0x1D560, 'M', u'o'),
+ (0x1D561, 'M', u'p'),
+ (0x1D562, 'M', u'q'),
+ (0x1D563, 'M', u'r'),
+ (0x1D564, 'M', u's'),
+ (0x1D565, 'M', u't'),
+ (0x1D566, 'M', u'u'),
+ (0x1D567, 'M', u'v'),
+ (0x1D568, 'M', u'w'),
+ (0x1D569, 'M', u'x'),
+ (0x1D56A, 'M', u'y'),
+ (0x1D56B, 'M', u'z'),
+ (0x1D56C, 'M', u'a'),
+ (0x1D56D, 'M', u'b'),
+ (0x1D56E, 'M', u'c'),
+ (0x1D56F, 'M', u'd'),
+ (0x1D570, 'M', u'e'),
+ (0x1D571, 'M', u'f'),
+ (0x1D572, 'M', u'g'),
+ (0x1D573, 'M', u'h'),
+ (0x1D574, 'M', u'i'),
+ (0x1D575, 'M', u'j'),
+ (0x1D576, 'M', u'k'),
+ (0x1D577, 'M', u'l'),
+ (0x1D578, 'M', u'm'),
+ (0x1D579, 'M', u'n'),
+ (0x1D57A, 'M', u'o'),
+ (0x1D57B, 'M', u'p'),
+ (0x1D57C, 'M', u'q'),
+ (0x1D57D, 'M', u'r'),
+ (0x1D57E, 'M', u's'),
+ (0x1D57F, 'M', u't'),
+ (0x1D580, 'M', u'u'),
+ (0x1D581, 'M', u'v'),
+ (0x1D582, 'M', u'w'),
+ (0x1D583, 'M', u'x'),
+ ]
+
+def _seg_63():
+ return [
+ (0x1D584, 'M', u'y'),
+ (0x1D585, 'M', u'z'),
+ (0x1D586, 'M', u'a'),
+ (0x1D587, 'M', u'b'),
+ (0x1D588, 'M', u'c'),
+ (0x1D589, 'M', u'd'),
+ (0x1D58A, 'M', u'e'),
+ (0x1D58B, 'M', u'f'),
+ (0x1D58C, 'M', u'g'),
+ (0x1D58D, 'M', u'h'),
+ (0x1D58E, 'M', u'i'),
+ (0x1D58F, 'M', u'j'),
+ (0x1D590, 'M', u'k'),
+ (0x1D591, 'M', u'l'),
+ (0x1D592, 'M', u'm'),
+ (0x1D593, 'M', u'n'),
+ (0x1D594, 'M', u'o'),
+ (0x1D595, 'M', u'p'),
+ (0x1D596, 'M', u'q'),
+ (0x1D597, 'M', u'r'),
+ (0x1D598, 'M', u's'),
+ (0x1D599, 'M', u't'),
+ (0x1D59A, 'M', u'u'),
+ (0x1D59B, 'M', u'v'),
+ (0x1D59C, 'M', u'w'),
+ (0x1D59D, 'M', u'x'),
+ (0x1D59E, 'M', u'y'),
+ (0x1D59F, 'M', u'z'),
+ (0x1D5A0, 'M', u'a'),
+ (0x1D5A1, 'M', u'b'),
+ (0x1D5A2, 'M', u'c'),
+ (0x1D5A3, 'M', u'd'),
+ (0x1D5A4, 'M', u'e'),
+ (0x1D5A5, 'M', u'f'),
+ (0x1D5A6, 'M', u'g'),
+ (0x1D5A7, 'M', u'h'),
+ (0x1D5A8, 'M', u'i'),
+ (0x1D5A9, 'M', u'j'),
+ (0x1D5AA, 'M', u'k'),
+ (0x1D5AB, 'M', u'l'),
+ (0x1D5AC, 'M', u'm'),
+ (0x1D5AD, 'M', u'n'),
+ (0x1D5AE, 'M', u'o'),
+ (0x1D5AF, 'M', u'p'),
+ (0x1D5B0, 'M', u'q'),
+ (0x1D5B1, 'M', u'r'),
+ (0x1D5B2, 'M', u's'),
+ (0x1D5B3, 'M', u't'),
+ (0x1D5B4, 'M', u'u'),
+ (0x1D5B5, 'M', u'v'),
+ (0x1D5B6, 'M', u'w'),
+ (0x1D5B7, 'M', u'x'),
+ (0x1D5B8, 'M', u'y'),
+ (0x1D5B9, 'M', u'z'),
+ (0x1D5BA, 'M', u'a'),
+ (0x1D5BB, 'M', u'b'),
+ (0x1D5BC, 'M', u'c'),
+ (0x1D5BD, 'M', u'd'),
+ (0x1D5BE, 'M', u'e'),
+ (0x1D5BF, 'M', u'f'),
+ (0x1D5C0, 'M', u'g'),
+ (0x1D5C1, 'M', u'h'),
+ (0x1D5C2, 'M', u'i'),
+ (0x1D5C3, 'M', u'j'),
+ (0x1D5C4, 'M', u'k'),
+ (0x1D5C5, 'M', u'l'),
+ (0x1D5C6, 'M', u'm'),
+ (0x1D5C7, 'M', u'n'),
+ (0x1D5C8, 'M', u'o'),
+ (0x1D5C9, 'M', u'p'),
+ (0x1D5CA, 'M', u'q'),
+ (0x1D5CB, 'M', u'r'),
+ (0x1D5CC, 'M', u's'),
+ (0x1D5CD, 'M', u't'),
+ (0x1D5CE, 'M', u'u'),
+ (0x1D5CF, 'M', u'v'),
+ (0x1D5D0, 'M', u'w'),
+ (0x1D5D1, 'M', u'x'),
+ (0x1D5D2, 'M', u'y'),
+ (0x1D5D3, 'M', u'z'),
+ (0x1D5D4, 'M', u'a'),
+ (0x1D5D5, 'M', u'b'),
+ (0x1D5D6, 'M', u'c'),
+ (0x1D5D7, 'M', u'd'),
+ (0x1D5D8, 'M', u'e'),
+ (0x1D5D9, 'M', u'f'),
+ (0x1D5DA, 'M', u'g'),
+ (0x1D5DB, 'M', u'h'),
+ (0x1D5DC, 'M', u'i'),
+ (0x1D5DD, 'M', u'j'),
+ (0x1D5DE, 'M', u'k'),
+ (0x1D5DF, 'M', u'l'),
+ (0x1D5E0, 'M', u'm'),
+ (0x1D5E1, 'M', u'n'),
+ (0x1D5E2, 'M', u'o'),
+ (0x1D5E3, 'M', u'p'),
+ (0x1D5E4, 'M', u'q'),
+ (0x1D5E5, 'M', u'r'),
+ (0x1D5E6, 'M', u's'),
+ (0x1D5E7, 'M', u't'),
+ ]
+
+def _seg_64():
+ return [
+ (0x1D5E8, 'M', u'u'),
+ (0x1D5E9, 'M', u'v'),
+ (0x1D5EA, 'M', u'w'),
+ (0x1D5EB, 'M', u'x'),
+ (0x1D5EC, 'M', u'y'),
+ (0x1D5ED, 'M', u'z'),
+ (0x1D5EE, 'M', u'a'),
+ (0x1D5EF, 'M', u'b'),
+ (0x1D5F0, 'M', u'c'),
+ (0x1D5F1, 'M', u'd'),
+ (0x1D5F2, 'M', u'e'),
+ (0x1D5F3, 'M', u'f'),
+ (0x1D5F4, 'M', u'g'),
+ (0x1D5F5, 'M', u'h'),
+ (0x1D5F6, 'M', u'i'),
+ (0x1D5F7, 'M', u'j'),
+ (0x1D5F8, 'M', u'k'),
+ (0x1D5F9, 'M', u'l'),
+ (0x1D5FA, 'M', u'm'),
+ (0x1D5FB, 'M', u'n'),
+ (0x1D5FC, 'M', u'o'),
+ (0x1D5FD, 'M', u'p'),
+ (0x1D5FE, 'M', u'q'),
+ (0x1D5FF, 'M', u'r'),
+ (0x1D600, 'M', u's'),
+ (0x1D601, 'M', u't'),
+ (0x1D602, 'M', u'u'),
+ (0x1D603, 'M', u'v'),
+ (0x1D604, 'M', u'w'),
+ (0x1D605, 'M', u'x'),
+ (0x1D606, 'M', u'y'),
+ (0x1D607, 'M', u'z'),
+ (0x1D608, 'M', u'a'),
+ (0x1D609, 'M', u'b'),
+ (0x1D60A, 'M', u'c'),
+ (0x1D60B, 'M', u'd'),
+ (0x1D60C, 'M', u'e'),
+ (0x1D60D, 'M', u'f'),
+ (0x1D60E, 'M', u'g'),
+ (0x1D60F, 'M', u'h'),
+ (0x1D610, 'M', u'i'),
+ (0x1D611, 'M', u'j'),
+ (0x1D612, 'M', u'k'),
+ (0x1D613, 'M', u'l'),
+ (0x1D614, 'M', u'm'),
+ (0x1D615, 'M', u'n'),
+ (0x1D616, 'M', u'o'),
+ (0x1D617, 'M', u'p'),
+ (0x1D618, 'M', u'q'),
+ (0x1D619, 'M', u'r'),
+ (0x1D61A, 'M', u's'),
+ (0x1D61B, 'M', u't'),
+ (0x1D61C, 'M', u'u'),
+ (0x1D61D, 'M', u'v'),
+ (0x1D61E, 'M', u'w'),
+ (0x1D61F, 'M', u'x'),
+ (0x1D620, 'M', u'y'),
+ (0x1D621, 'M', u'z'),
+ (0x1D622, 'M', u'a'),
+ (0x1D623, 'M', u'b'),
+ (0x1D624, 'M', u'c'),
+ (0x1D625, 'M', u'd'),
+ (0x1D626, 'M', u'e'),
+ (0x1D627, 'M', u'f'),
+ (0x1D628, 'M', u'g'),
+ (0x1D629, 'M', u'h'),
+ (0x1D62A, 'M', u'i'),
+ (0x1D62B, 'M', u'j'),
+ (0x1D62C, 'M', u'k'),
+ (0x1D62D, 'M', u'l'),
+ (0x1D62E, 'M', u'm'),
+ (0x1D62F, 'M', u'n'),
+ (0x1D630, 'M', u'o'),
+ (0x1D631, 'M', u'p'),
+ (0x1D632, 'M', u'q'),
+ (0x1D633, 'M', u'r'),
+ (0x1D634, 'M', u's'),
+ (0x1D635, 'M', u't'),
+ (0x1D636, 'M', u'u'),
+ (0x1D637, 'M', u'v'),
+ (0x1D638, 'M', u'w'),
+ (0x1D639, 'M', u'x'),
+ (0x1D63A, 'M', u'y'),
+ (0x1D63B, 'M', u'z'),
+ (0x1D63C, 'M', u'a'),
+ (0x1D63D, 'M', u'b'),
+ (0x1D63E, 'M', u'c'),
+ (0x1D63F, 'M', u'd'),
+ (0x1D640, 'M', u'e'),
+ (0x1D641, 'M', u'f'),
+ (0x1D642, 'M', u'g'),
+ (0x1D643, 'M', u'h'),
+ (0x1D644, 'M', u'i'),
+ (0x1D645, 'M', u'j'),
+ (0x1D646, 'M', u'k'),
+ (0x1D647, 'M', u'l'),
+ (0x1D648, 'M', u'm'),
+ (0x1D649, 'M', u'n'),
+ (0x1D64A, 'M', u'o'),
+ (0x1D64B, 'M', u'p'),
+ ]
+
+def _seg_65():
+ return [
+ (0x1D64C, 'M', u'q'),
+ (0x1D64D, 'M', u'r'),
+ (0x1D64E, 'M', u's'),
+ (0x1D64F, 'M', u't'),
+ (0x1D650, 'M', u'u'),
+ (0x1D651, 'M', u'v'),
+ (0x1D652, 'M', u'w'),
+ (0x1D653, 'M', u'x'),
+ (0x1D654, 'M', u'y'),
+ (0x1D655, 'M', u'z'),
+ (0x1D656, 'M', u'a'),
+ (0x1D657, 'M', u'b'),
+ (0x1D658, 'M', u'c'),
+ (0x1D659, 'M', u'd'),
+ (0x1D65A, 'M', u'e'),
+ (0x1D65B, 'M', u'f'),
+ (0x1D65C, 'M', u'g'),
+ (0x1D65D, 'M', u'h'),
+ (0x1D65E, 'M', u'i'),
+ (0x1D65F, 'M', u'j'),
+ (0x1D660, 'M', u'k'),
+ (0x1D661, 'M', u'l'),
+ (0x1D662, 'M', u'm'),
+ (0x1D663, 'M', u'n'),
+ (0x1D664, 'M', u'o'),
+ (0x1D665, 'M', u'p'),
+ (0x1D666, 'M', u'q'),
+ (0x1D667, 'M', u'r'),
+ (0x1D668, 'M', u's'),
+ (0x1D669, 'M', u't'),
+ (0x1D66A, 'M', u'u'),
+ (0x1D66B, 'M', u'v'),
+ (0x1D66C, 'M', u'w'),
+ (0x1D66D, 'M', u'x'),
+ (0x1D66E, 'M', u'y'),
+ (0x1D66F, 'M', u'z'),
+ (0x1D670, 'M', u'a'),
+ (0x1D671, 'M', u'b'),
+ (0x1D672, 'M', u'c'),
+ (0x1D673, 'M', u'd'),
+ (0x1D674, 'M', u'e'),
+ (0x1D675, 'M', u'f'),
+ (0x1D676, 'M', u'g'),
+ (0x1D677, 'M', u'h'),
+ (0x1D678, 'M', u'i'),
+ (0x1D679, 'M', u'j'),
+ (0x1D67A, 'M', u'k'),
+ (0x1D67B, 'M', u'l'),
+ (0x1D67C, 'M', u'm'),
+ (0x1D67D, 'M', u'n'),
+ (0x1D67E, 'M', u'o'),
+ (0x1D67F, 'M', u'p'),
+ (0x1D680, 'M', u'q'),
+ (0x1D681, 'M', u'r'),
+ (0x1D682, 'M', u's'),
+ (0x1D683, 'M', u't'),
+ (0x1D684, 'M', u'u'),
+ (0x1D685, 'M', u'v'),
+ (0x1D686, 'M', u'w'),
+ (0x1D687, 'M', u'x'),
+ (0x1D688, 'M', u'y'),
+ (0x1D689, 'M', u'z'),
+ (0x1D68A, 'M', u'a'),
+ (0x1D68B, 'M', u'b'),
+ (0x1D68C, 'M', u'c'),
+ (0x1D68D, 'M', u'd'),
+ (0x1D68E, 'M', u'e'),
+ (0x1D68F, 'M', u'f'),
+ (0x1D690, 'M', u'g'),
+ (0x1D691, 'M', u'h'),
+ (0x1D692, 'M', u'i'),
+ (0x1D693, 'M', u'j'),
+ (0x1D694, 'M', u'k'),
+ (0x1D695, 'M', u'l'),
+ (0x1D696, 'M', u'm'),
+ (0x1D697, 'M', u'n'),
+ (0x1D698, 'M', u'o'),
+ (0x1D699, 'M', u'p'),
+ (0x1D69A, 'M', u'q'),
+ (0x1D69B, 'M', u'r'),
+ (0x1D69C, 'M', u's'),
+ (0x1D69D, 'M', u't'),
+ (0x1D69E, 'M', u'u'),
+ (0x1D69F, 'M', u'v'),
+ (0x1D6A0, 'M', u'w'),
+ (0x1D6A1, 'M', u'x'),
+ (0x1D6A2, 'M', u'y'),
+ (0x1D6A3, 'M', u'z'),
+ (0x1D6A4, 'M', u'ı'),
+ (0x1D6A5, 'M', u'ȷ'),
+ (0x1D6A6, 'X'),
+ (0x1D6A8, 'M', u'α'),
+ (0x1D6A9, 'M', u'β'),
+ (0x1D6AA, 'M', u'γ'),
+ (0x1D6AB, 'M', u'δ'),
+ (0x1D6AC, 'M', u'ε'),
+ (0x1D6AD, 'M', u'ζ'),
+ (0x1D6AE, 'M', u'η'),
+ (0x1D6AF, 'M', u'θ'),
+ (0x1D6B0, 'M', u'ι'),
+ ]
+
+def _seg_66():
+ return [
+ (0x1D6B1, 'M', u'κ'),
+ (0x1D6B2, 'M', u'λ'),
+ (0x1D6B3, 'M', u'μ'),
+ (0x1D6B4, 'M', u'ν'),
+ (0x1D6B5, 'M', u'ξ'),
+ (0x1D6B6, 'M', u'ο'),
+ (0x1D6B7, 'M', u'π'),
+ (0x1D6B8, 'M', u'ρ'),
+ (0x1D6B9, 'M', u'θ'),
+ (0x1D6BA, 'M', u'σ'),
+ (0x1D6BB, 'M', u'τ'),
+ (0x1D6BC, 'M', u'υ'),
+ (0x1D6BD, 'M', u'φ'),
+ (0x1D6BE, 'M', u'χ'),
+ (0x1D6BF, 'M', u'ψ'),
+ (0x1D6C0, 'M', u'ω'),
+ (0x1D6C1, 'M', u'∇'),
+ (0x1D6C2, 'M', u'α'),
+ (0x1D6C3, 'M', u'β'),
+ (0x1D6C4, 'M', u'γ'),
+ (0x1D6C5, 'M', u'δ'),
+ (0x1D6C6, 'M', u'ε'),
+ (0x1D6C7, 'M', u'ζ'),
+ (0x1D6C8, 'M', u'η'),
+ (0x1D6C9, 'M', u'θ'),
+ (0x1D6CA, 'M', u'ι'),
+ (0x1D6CB, 'M', u'κ'),
+ (0x1D6CC, 'M', u'λ'),
+ (0x1D6CD, 'M', u'μ'),
+ (0x1D6CE, 'M', u'ν'),
+ (0x1D6CF, 'M', u'ξ'),
+ (0x1D6D0, 'M', u'ο'),
+ (0x1D6D1, 'M', u'π'),
+ (0x1D6D2, 'M', u'ρ'),
+ (0x1D6D3, 'M', u'σ'),
+ (0x1D6D5, 'M', u'τ'),
+ (0x1D6D6, 'M', u'υ'),
+ (0x1D6D7, 'M', u'φ'),
+ (0x1D6D8, 'M', u'χ'),
+ (0x1D6D9, 'M', u'ψ'),
+ (0x1D6DA, 'M', u'ω'),
+ (0x1D6DB, 'M', u'∂'),
+ (0x1D6DC, 'M', u'ε'),
+ (0x1D6DD, 'M', u'θ'),
+ (0x1D6DE, 'M', u'κ'),
+ (0x1D6DF, 'M', u'φ'),
+ (0x1D6E0, 'M', u'ρ'),
+ (0x1D6E1, 'M', u'π'),
+ (0x1D6E2, 'M', u'α'),
+ (0x1D6E3, 'M', u'β'),
+ (0x1D6E4, 'M', u'γ'),
+ (0x1D6E5, 'M', u'δ'),
+ (0x1D6E6, 'M', u'ε'),
+ (0x1D6E7, 'M', u'ζ'),
+ (0x1D6E8, 'M', u'η'),
+ (0x1D6E9, 'M', u'θ'),
+ (0x1D6EA, 'M', u'ι'),
+ (0x1D6EB, 'M', u'κ'),
+ (0x1D6EC, 'M', u'λ'),
+ (0x1D6ED, 'M', u'μ'),
+ (0x1D6EE, 'M', u'ν'),
+ (0x1D6EF, 'M', u'ξ'),
+ (0x1D6F0, 'M', u'ο'),
+ (0x1D6F1, 'M', u'π'),
+ (0x1D6F2, 'M', u'ρ'),
+ (0x1D6F3, 'M', u'θ'),
+ (0x1D6F4, 'M', u'σ'),
+ (0x1D6F5, 'M', u'τ'),
+ (0x1D6F6, 'M', u'υ'),
+ (0x1D6F7, 'M', u'φ'),
+ (0x1D6F8, 'M', u'χ'),
+ (0x1D6F9, 'M', u'ψ'),
+ (0x1D6FA, 'M', u'ω'),
+ (0x1D6FB, 'M', u'∇'),
+ (0x1D6FC, 'M', u'α'),
+ (0x1D6FD, 'M', u'β'),
+ (0x1D6FE, 'M', u'γ'),
+ (0x1D6FF, 'M', u'δ'),
+ (0x1D700, 'M', u'ε'),
+ (0x1D701, 'M', u'ζ'),
+ (0x1D702, 'M', u'η'),
+ (0x1D703, 'M', u'θ'),
+ (0x1D704, 'M', u'ι'),
+ (0x1D705, 'M', u'κ'),
+ (0x1D706, 'M', u'λ'),
+ (0x1D707, 'M', u'μ'),
+ (0x1D708, 'M', u'ν'),
+ (0x1D709, 'M', u'ξ'),
+ (0x1D70A, 'M', u'ο'),
+ (0x1D70B, 'M', u'π'),
+ (0x1D70C, 'M', u'ρ'),
+ (0x1D70D, 'M', u'σ'),
+ (0x1D70F, 'M', u'τ'),
+ (0x1D710, 'M', u'υ'),
+ (0x1D711, 'M', u'φ'),
+ (0x1D712, 'M', u'χ'),
+ (0x1D713, 'M', u'ψ'),
+ (0x1D714, 'M', u'ω'),
+ (0x1D715, 'M', u'∂'),
+ (0x1D716, 'M', u'ε'),
+ ]
+
+def _seg_67():
+ return [
+ (0x1D717, 'M', u'θ'),
+ (0x1D718, 'M', u'κ'),
+ (0x1D719, 'M', u'φ'),
+ (0x1D71A, 'M', u'ρ'),
+ (0x1D71B, 'M', u'π'),
+ (0x1D71C, 'M', u'α'),
+ (0x1D71D, 'M', u'β'),
+ (0x1D71E, 'M', u'γ'),
+ (0x1D71F, 'M', u'δ'),
+ (0x1D720, 'M', u'ε'),
+ (0x1D721, 'M', u'ζ'),
+ (0x1D722, 'M', u'η'),
+ (0x1D723, 'M', u'θ'),
+ (0x1D724, 'M', u'ι'),
+ (0x1D725, 'M', u'κ'),
+ (0x1D726, 'M', u'λ'),
+ (0x1D727, 'M', u'μ'),
+ (0x1D728, 'M', u'ν'),
+ (0x1D729, 'M', u'ξ'),
+ (0x1D72A, 'M', u'ο'),
+ (0x1D72B, 'M', u'π'),
+ (0x1D72C, 'M', u'ρ'),
+ (0x1D72D, 'M', u'θ'),
+ (0x1D72E, 'M', u'σ'),
+ (0x1D72F, 'M', u'τ'),
+ (0x1D730, 'M', u'υ'),
+ (0x1D731, 'M', u'φ'),
+ (0x1D732, 'M', u'χ'),
+ (0x1D733, 'M', u'ψ'),
+ (0x1D734, 'M', u'ω'),
+ (0x1D735, 'M', u'∇'),
+ (0x1D736, 'M', u'α'),
+ (0x1D737, 'M', u'β'),
+ (0x1D738, 'M', u'γ'),
+ (0x1D739, 'M', u'δ'),
+ (0x1D73A, 'M', u'ε'),
+ (0x1D73B, 'M', u'ζ'),
+ (0x1D73C, 'M', u'η'),
+ (0x1D73D, 'M', u'θ'),
+ (0x1D73E, 'M', u'ι'),
+ (0x1D73F, 'M', u'κ'),
+ (0x1D740, 'M', u'λ'),
+ (0x1D741, 'M', u'μ'),
+ (0x1D742, 'M', u'ν'),
+ (0x1D743, 'M', u'ξ'),
+ (0x1D744, 'M', u'ο'),
+ (0x1D745, 'M', u'π'),
+ (0x1D746, 'M', u'ρ'),
+ (0x1D747, 'M', u'σ'),
+ (0x1D749, 'M', u'τ'),
+ (0x1D74A, 'M', u'υ'),
+ (0x1D74B, 'M', u'φ'),
+ (0x1D74C, 'M', u'χ'),
+ (0x1D74D, 'M', u'ψ'),
+ (0x1D74E, 'M', u'ω'),
+ (0x1D74F, 'M', u'∂'),
+ (0x1D750, 'M', u'ε'),
+ (0x1D751, 'M', u'θ'),
+ (0x1D752, 'M', u'κ'),
+ (0x1D753, 'M', u'φ'),
+ (0x1D754, 'M', u'ρ'),
+ (0x1D755, 'M', u'π'),
+ (0x1D756, 'M', u'α'),
+ (0x1D757, 'M', u'β'),
+ (0x1D758, 'M', u'γ'),
+ (0x1D759, 'M', u'δ'),
+ (0x1D75A, 'M', u'ε'),
+ (0x1D75B, 'M', u'ζ'),
+ (0x1D75C, 'M', u'η'),
+ (0x1D75D, 'M', u'θ'),
+ (0x1D75E, 'M', u'ι'),
+ (0x1D75F, 'M', u'κ'),
+ (0x1D760, 'M', u'λ'),
+ (0x1D761, 'M', u'μ'),
+ (0x1D762, 'M', u'ν'),
+ (0x1D763, 'M', u'ξ'),
+ (0x1D764, 'M', u'ο'),
+ (0x1D765, 'M', u'π'),
+ (0x1D766, 'M', u'ρ'),
+ (0x1D767, 'M', u'θ'),
+ (0x1D768, 'M', u'σ'),
+ (0x1D769, 'M', u'τ'),
+ (0x1D76A, 'M', u'υ'),
+ (0x1D76B, 'M', u'φ'),
+ (0x1D76C, 'M', u'χ'),
+ (0x1D76D, 'M', u'ψ'),
+ (0x1D76E, 'M', u'ω'),
+ (0x1D76F, 'M', u'∇'),
+ (0x1D770, 'M', u'α'),
+ (0x1D771, 'M', u'β'),
+ (0x1D772, 'M', u'γ'),
+ (0x1D773, 'M', u'δ'),
+ (0x1D774, 'M', u'ε'),
+ (0x1D775, 'M', u'ζ'),
+ (0x1D776, 'M', u'η'),
+ (0x1D777, 'M', u'θ'),
+ (0x1D778, 'M', u'ι'),
+ (0x1D779, 'M', u'κ'),
+ (0x1D77A, 'M', u'λ'),
+ (0x1D77B, 'M', u'μ'),
+ ]
+
+def _seg_68():
+ return [
+ (0x1D77C, 'M', u'ν'),
+ (0x1D77D, 'M', u'ξ'),
+ (0x1D77E, 'M', u'ο'),
+ (0x1D77F, 'M', u'π'),
+ (0x1D780, 'M', u'ρ'),
+ (0x1D781, 'M', u'σ'),
+ (0x1D783, 'M', u'τ'),
+ (0x1D784, 'M', u'υ'),
+ (0x1D785, 'M', u'φ'),
+ (0x1D786, 'M', u'χ'),
+ (0x1D787, 'M', u'ψ'),
+ (0x1D788, 'M', u'ω'),
+ (0x1D789, 'M', u'∂'),
+ (0x1D78A, 'M', u'ε'),
+ (0x1D78B, 'M', u'θ'),
+ (0x1D78C, 'M', u'κ'),
+ (0x1D78D, 'M', u'φ'),
+ (0x1D78E, 'M', u'ρ'),
+ (0x1D78F, 'M', u'π'),
+ (0x1D790, 'M', u'α'),
+ (0x1D791, 'M', u'β'),
+ (0x1D792, 'M', u'γ'),
+ (0x1D793, 'M', u'δ'),
+ (0x1D794, 'M', u'ε'),
+ (0x1D795, 'M', u'ζ'),
+ (0x1D796, 'M', u'η'),
+ (0x1D797, 'M', u'θ'),
+ (0x1D798, 'M', u'ι'),
+ (0x1D799, 'M', u'κ'),
+ (0x1D79A, 'M', u'λ'),
+ (0x1D79B, 'M', u'μ'),
+ (0x1D79C, 'M', u'ν'),
+ (0x1D79D, 'M', u'ξ'),
+ (0x1D79E, 'M', u'ο'),
+ (0x1D79F, 'M', u'π'),
+ (0x1D7A0, 'M', u'ρ'),
+ (0x1D7A1, 'M', u'θ'),
+ (0x1D7A2, 'M', u'σ'),
+ (0x1D7A3, 'M', u'τ'),
+ (0x1D7A4, 'M', u'υ'),
+ (0x1D7A5, 'M', u'φ'),
+ (0x1D7A6, 'M', u'χ'),
+ (0x1D7A7, 'M', u'ψ'),
+ (0x1D7A8, 'M', u'ω'),
+ (0x1D7A9, 'M', u'∇'),
+ (0x1D7AA, 'M', u'α'),
+ (0x1D7AB, 'M', u'β'),
+ (0x1D7AC, 'M', u'γ'),
+ (0x1D7AD, 'M', u'δ'),
+ (0x1D7AE, 'M', u'ε'),
+ (0x1D7AF, 'M', u'ζ'),
+ (0x1D7B0, 'M', u'η'),
+ (0x1D7B1, 'M', u'θ'),
+ (0x1D7B2, 'M', u'ι'),
+ (0x1D7B3, 'M', u'κ'),
+ (0x1D7B4, 'M', u'λ'),
+ (0x1D7B5, 'M', u'μ'),
+ (0x1D7B6, 'M', u'ν'),
+ (0x1D7B7, 'M', u'ξ'),
+ (0x1D7B8, 'M', u'ο'),
+ (0x1D7B9, 'M', u'π'),
+ (0x1D7BA, 'M', u'ρ'),
+ (0x1D7BB, 'M', u'σ'),
+ (0x1D7BD, 'M', u'τ'),
+ (0x1D7BE, 'M', u'υ'),
+ (0x1D7BF, 'M', u'φ'),
+ (0x1D7C0, 'M', u'χ'),
+ (0x1D7C1, 'M', u'ψ'),
+ (0x1D7C2, 'M', u'ω'),
+ (0x1D7C3, 'M', u'∂'),
+ (0x1D7C4, 'M', u'ε'),
+ (0x1D7C5, 'M', u'θ'),
+ (0x1D7C6, 'M', u'κ'),
+ (0x1D7C7, 'M', u'φ'),
+ (0x1D7C8, 'M', u'ρ'),
+ (0x1D7C9, 'M', u'π'),
+ (0x1D7CA, 'M', u'ϝ'),
+ (0x1D7CC, 'X'),
+ (0x1D7CE, 'M', u'0'),
+ (0x1D7CF, 'M', u'1'),
+ (0x1D7D0, 'M', u'2'),
+ (0x1D7D1, 'M', u'3'),
+ (0x1D7D2, 'M', u'4'),
+ (0x1D7D3, 'M', u'5'),
+ (0x1D7D4, 'M', u'6'),
+ (0x1D7D5, 'M', u'7'),
+ (0x1D7D6, 'M', u'8'),
+ (0x1D7D7, 'M', u'9'),
+ (0x1D7D8, 'M', u'0'),
+ (0x1D7D9, 'M', u'1'),
+ (0x1D7DA, 'M', u'2'),
+ (0x1D7DB, 'M', u'3'),
+ (0x1D7DC, 'M', u'4'),
+ (0x1D7DD, 'M', u'5'),
+ (0x1D7DE, 'M', u'6'),
+ (0x1D7DF, 'M', u'7'),
+ (0x1D7E0, 'M', u'8'),
+ (0x1D7E1, 'M', u'9'),
+ (0x1D7E2, 'M', u'0'),
+ (0x1D7E3, 'M', u'1'),
+ ]
+
+def _seg_69():
+ return [
+ (0x1D7E4, 'M', u'2'),
+ (0x1D7E5, 'M', u'3'),
+ (0x1D7E6, 'M', u'4'),
+ (0x1D7E7, 'M', u'5'),
+ (0x1D7E8, 'M', u'6'),
+ (0x1D7E9, 'M', u'7'),
+ (0x1D7EA, 'M', u'8'),
+ (0x1D7EB, 'M', u'9'),
+ (0x1D7EC, 'M', u'0'),
+ (0x1D7ED, 'M', u'1'),
+ (0x1D7EE, 'M', u'2'),
+ (0x1D7EF, 'M', u'3'),
+ (0x1D7F0, 'M', u'4'),
+ (0x1D7F1, 'M', u'5'),
+ (0x1D7F2, 'M', u'6'),
+ (0x1D7F3, 'M', u'7'),
+ (0x1D7F4, 'M', u'8'),
+ (0x1D7F5, 'M', u'9'),
+ (0x1D7F6, 'M', u'0'),
+ (0x1D7F7, 'M', u'1'),
+ (0x1D7F8, 'M', u'2'),
+ (0x1D7F9, 'M', u'3'),
+ (0x1D7FA, 'M', u'4'),
+ (0x1D7FB, 'M', u'5'),
+ (0x1D7FC, 'M', u'6'),
+ (0x1D7FD, 'M', u'7'),
+ (0x1D7FE, 'M', u'8'),
+ (0x1D7FF, 'M', u'9'),
+ (0x1D800, 'V'),
+ (0x1DA8C, 'X'),
+ (0x1DA9B, 'V'),
+ (0x1DAA0, 'X'),
+ (0x1DAA1, 'V'),
+ (0x1DAB0, 'X'),
+ (0x1E000, 'V'),
+ (0x1E007, 'X'),
+ (0x1E008, 'V'),
+ (0x1E019, 'X'),
+ (0x1E01B, 'V'),
+ (0x1E022, 'X'),
+ (0x1E023, 'V'),
+ (0x1E025, 'X'),
+ (0x1E026, 'V'),
+ (0x1E02B, 'X'),
+ (0x1E100, 'V'),
+ (0x1E12D, 'X'),
+ (0x1E130, 'V'),
+ (0x1E13E, 'X'),
+ (0x1E140, 'V'),
+ (0x1E14A, 'X'),
+ (0x1E14E, 'V'),
+ (0x1E150, 'X'),
+ (0x1E2C0, 'V'),
+ (0x1E2FA, 'X'),
+ (0x1E2FF, 'V'),
+ (0x1E300, 'X'),
+ (0x1E800, 'V'),
+ (0x1E8C5, 'X'),
+ (0x1E8C7, 'V'),
+ (0x1E8D7, 'X'),
+ (0x1E900, 'M', u'𞤢'),
+ (0x1E901, 'M', u'𞤣'),
+ (0x1E902, 'M', u'𞤤'),
+ (0x1E903, 'M', u'𞤥'),
+ (0x1E904, 'M', u'𞤦'),
+ (0x1E905, 'M', u'𞤧'),
+ (0x1E906, 'M', u'𞤨'),
+ (0x1E907, 'M', u'𞤩'),
+ (0x1E908, 'M', u'𞤪'),
+ (0x1E909, 'M', u'𞤫'),
+ (0x1E90A, 'M', u'𞤬'),
+ (0x1E90B, 'M', u'𞤭'),
+ (0x1E90C, 'M', u'𞤮'),
+ (0x1E90D, 'M', u'𞤯'),
+ (0x1E90E, 'M', u'𞤰'),
+ (0x1E90F, 'M', u'𞤱'),
+ (0x1E910, 'M', u'𞤲'),
+ (0x1E911, 'M', u'𞤳'),
+ (0x1E912, 'M', u'𞤴'),
+ (0x1E913, 'M', u'𞤵'),
+ (0x1E914, 'M', u'𞤶'),
+ (0x1E915, 'M', u'𞤷'),
+ (0x1E916, 'M', u'𞤸'),
+ (0x1E917, 'M', u'𞤹'),
+ (0x1E918, 'M', u'𞤺'),
+ (0x1E919, 'M', u'𞤻'),
+ (0x1E91A, 'M', u'𞤼'),
+ (0x1E91B, 'M', u'𞤽'),
+ (0x1E91C, 'M', u'𞤾'),
+ (0x1E91D, 'M', u'𞤿'),
+ (0x1E91E, 'M', u'𞥀'),
+ (0x1E91F, 'M', u'𞥁'),
+ (0x1E920, 'M', u'𞥂'),
+ (0x1E921, 'M', u'𞥃'),
+ (0x1E922, 'V'),
+ (0x1E94C, 'X'),
+ (0x1E950, 'V'),
+ (0x1E95A, 'X'),
+ (0x1E95E, 'V'),
+ (0x1E960, 'X'),
+ ]
+
+def _seg_70():
+ return [
+ (0x1EC71, 'V'),
+ (0x1ECB5, 'X'),
+ (0x1ED01, 'V'),
+ (0x1ED3E, 'X'),
+ (0x1EE00, 'M', u'ا'),
+ (0x1EE01, 'M', u'ب'),
+ (0x1EE02, 'M', u'ج'),
+ (0x1EE03, 'M', u'د'),
+ (0x1EE04, 'X'),
+ (0x1EE05, 'M', u'و'),
+ (0x1EE06, 'M', u'ز'),
+ (0x1EE07, 'M', u'ح'),
+ (0x1EE08, 'M', u'ط'),
+ (0x1EE09, 'M', u'ي'),
+ (0x1EE0A, 'M', u'ك'),
+ (0x1EE0B, 'M', u'ل'),
+ (0x1EE0C, 'M', u'م'),
+ (0x1EE0D, 'M', u'ن'),
+ (0x1EE0E, 'M', u'س'),
+ (0x1EE0F, 'M', u'ع'),
+ (0x1EE10, 'M', u'ف'),
+ (0x1EE11, 'M', u'ص'),
+ (0x1EE12, 'M', u'ق'),
+ (0x1EE13, 'M', u'ر'),
+ (0x1EE14, 'M', u'ش'),
+ (0x1EE15, 'M', u'ت'),
+ (0x1EE16, 'M', u'ث'),
+ (0x1EE17, 'M', u'خ'),
+ (0x1EE18, 'M', u'ذ'),
+ (0x1EE19, 'M', u'ض'),
+ (0x1EE1A, 'M', u'ظ'),
+ (0x1EE1B, 'M', u'غ'),
+ (0x1EE1C, 'M', u'ٮ'),
+ (0x1EE1D, 'M', u'ں'),
+ (0x1EE1E, 'M', u'ڡ'),
+ (0x1EE1F, 'M', u'ٯ'),
+ (0x1EE20, 'X'),
+ (0x1EE21, 'M', u'ب'),
+ (0x1EE22, 'M', u'ج'),
+ (0x1EE23, 'X'),
+ (0x1EE24, 'M', u'ه'),
+ (0x1EE25, 'X'),
+ (0x1EE27, 'M', u'ح'),
+ (0x1EE28, 'X'),
+ (0x1EE29, 'M', u'ي'),
+ (0x1EE2A, 'M', u'ك'),
+ (0x1EE2B, 'M', u'ل'),
+ (0x1EE2C, 'M', u'م'),
+ (0x1EE2D, 'M', u'ن'),
+ (0x1EE2E, 'M', u'س'),
+ (0x1EE2F, 'M', u'ع'),
+ (0x1EE30, 'M', u'ف'),
+ (0x1EE31, 'M', u'ص'),
+ (0x1EE32, 'M', u'ق'),
+ (0x1EE33, 'X'),
+ (0x1EE34, 'M', u'ش'),
+ (0x1EE35, 'M', u'ت'),
+ (0x1EE36, 'M', u'ث'),
+ (0x1EE37, 'M', u'خ'),
+ (0x1EE38, 'X'),
+ (0x1EE39, 'M', u'ض'),
+ (0x1EE3A, 'X'),
+ (0x1EE3B, 'M', u'غ'),
+ (0x1EE3C, 'X'),
+ (0x1EE42, 'M', u'ج'),
+ (0x1EE43, 'X'),
+ (0x1EE47, 'M', u'ح'),
+ (0x1EE48, 'X'),
+ (0x1EE49, 'M', u'ي'),
+ (0x1EE4A, 'X'),
+ (0x1EE4B, 'M', u'ل'),
+ (0x1EE4C, 'X'),
+ (0x1EE4D, 'M', u'ن'),
+ (0x1EE4E, 'M', u'س'),
+ (0x1EE4F, 'M', u'ع'),
+ (0x1EE50, 'X'),
+ (0x1EE51, 'M', u'ص'),
+ (0x1EE52, 'M', u'ق'),
+ (0x1EE53, 'X'),
+ (0x1EE54, 'M', u'ش'),
+ (0x1EE55, 'X'),
+ (0x1EE57, 'M', u'خ'),
+ (0x1EE58, 'X'),
+ (0x1EE59, 'M', u'ض'),
+ (0x1EE5A, 'X'),
+ (0x1EE5B, 'M', u'غ'),
+ (0x1EE5C, 'X'),
+ (0x1EE5D, 'M', u'ں'),
+ (0x1EE5E, 'X'),
+ (0x1EE5F, 'M', u'ٯ'),
+ (0x1EE60, 'X'),
+ (0x1EE61, 'M', u'ب'),
+ (0x1EE62, 'M', u'ج'),
+ (0x1EE63, 'X'),
+ (0x1EE64, 'M', u'ه'),
+ (0x1EE65, 'X'),
+ (0x1EE67, 'M', u'ح'),
+ (0x1EE68, 'M', u'ط'),
+ (0x1EE69, 'M', u'ي'),
+ (0x1EE6A, 'M', u'ك'),
+ ]
+
+def _seg_71():
+ return [
+ (0x1EE6B, 'X'),
+ (0x1EE6C, 'M', u'م'),
+ (0x1EE6D, 'M', u'ن'),
+ (0x1EE6E, 'M', u'س'),
+ (0x1EE6F, 'M', u'ع'),
+ (0x1EE70, 'M', u'ف'),
+ (0x1EE71, 'M', u'ص'),
+ (0x1EE72, 'M', u'ق'),
+ (0x1EE73, 'X'),
+ (0x1EE74, 'M', u'ش'),
+ (0x1EE75, 'M', u'ت'),
+ (0x1EE76, 'M', u'ث'),
+ (0x1EE77, 'M', u'خ'),
+ (0x1EE78, 'X'),
+ (0x1EE79, 'M', u'ض'),
+ (0x1EE7A, 'M', u'ظ'),
+ (0x1EE7B, 'M', u'غ'),
+ (0x1EE7C, 'M', u'ٮ'),
+ (0x1EE7D, 'X'),
+ (0x1EE7E, 'M', u'ڡ'),
+ (0x1EE7F, 'X'),
+ (0x1EE80, 'M', u'ا'),
+ (0x1EE81, 'M', u'ب'),
+ (0x1EE82, 'M', u'ج'),
+ (0x1EE83, 'M', u'د'),
+ (0x1EE84, 'M', u'ه'),
+ (0x1EE85, 'M', u'و'),
+ (0x1EE86, 'M', u'ز'),
+ (0x1EE87, 'M', u'ح'),
+ (0x1EE88, 'M', u'ط'),
+ (0x1EE89, 'M', u'ي'),
+ (0x1EE8A, 'X'),
+ (0x1EE8B, 'M', u'ل'),
+ (0x1EE8C, 'M', u'م'),
+ (0x1EE8D, 'M', u'ن'),
+ (0x1EE8E, 'M', u'س'),
+ (0x1EE8F, 'M', u'ع'),
+ (0x1EE90, 'M', u'ف'),
+ (0x1EE91, 'M', u'ص'),
+ (0x1EE92, 'M', u'ق'),
+ (0x1EE93, 'M', u'ر'),
+ (0x1EE94, 'M', u'ش'),
+ (0x1EE95, 'M', u'ت'),
+ (0x1EE96, 'M', u'ث'),
+ (0x1EE97, 'M', u'خ'),
+ (0x1EE98, 'M', u'ذ'),
+ (0x1EE99, 'M', u'ض'),
+ (0x1EE9A, 'M', u'ظ'),
+ (0x1EE9B, 'M', u'غ'),
+ (0x1EE9C, 'X'),
+ (0x1EEA1, 'M', u'ب'),
+ (0x1EEA2, 'M', u'ج'),
+ (0x1EEA3, 'M', u'د'),
+ (0x1EEA4, 'X'),
+ (0x1EEA5, 'M', u'و'),
+ (0x1EEA6, 'M', u'ز'),
+ (0x1EEA7, 'M', u'ح'),
+ (0x1EEA8, 'M', u'ط'),
+ (0x1EEA9, 'M', u'ي'),
+ (0x1EEAA, 'X'),
+ (0x1EEAB, 'M', u'ل'),
+ (0x1EEAC, 'M', u'م'),
+ (0x1EEAD, 'M', u'ن'),
+ (0x1EEAE, 'M', u'س'),
+ (0x1EEAF, 'M', u'ع'),
+ (0x1EEB0, 'M', u'ف'),
+ (0x1EEB1, 'M', u'ص'),
+ (0x1EEB2, 'M', u'ق'),
+ (0x1EEB3, 'M', u'ر'),
+ (0x1EEB4, 'M', u'ش'),
+ (0x1EEB5, 'M', u'ت'),
+ (0x1EEB6, 'M', u'ث'),
+ (0x1EEB7, 'M', u'خ'),
+ (0x1EEB8, 'M', u'ذ'),
+ (0x1EEB9, 'M', u'ض'),
+ (0x1EEBA, 'M', u'ظ'),
+ (0x1EEBB, 'M', u'غ'),
+ (0x1EEBC, 'X'),
+ (0x1EEF0, 'V'),
+ (0x1EEF2, 'X'),
+ (0x1F000, 'V'),
+ (0x1F02C, 'X'),
+ (0x1F030, 'V'),
+ (0x1F094, 'X'),
+ (0x1F0A0, 'V'),
+ (0x1F0AF, 'X'),
+ (0x1F0B1, 'V'),
+ (0x1F0C0, 'X'),
+ (0x1F0C1, 'V'),
+ (0x1F0D0, 'X'),
+ (0x1F0D1, 'V'),
+ (0x1F0F6, 'X'),
+ (0x1F101, '3', u'0,'),
+ (0x1F102, '3', u'1,'),
+ (0x1F103, '3', u'2,'),
+ (0x1F104, '3', u'3,'),
+ (0x1F105, '3', u'4,'),
+ (0x1F106, '3', u'5,'),
+ (0x1F107, '3', u'6,'),
+ (0x1F108, '3', u'7,'),
+ ]
+
+def _seg_72():
+ return [
+ (0x1F109, '3', u'8,'),
+ (0x1F10A, '3', u'9,'),
+ (0x1F10B, 'V'),
+ (0x1F110, '3', u'(a)'),
+ (0x1F111, '3', u'(b)'),
+ (0x1F112, '3', u'(c)'),
+ (0x1F113, '3', u'(d)'),
+ (0x1F114, '3', u'(e)'),
+ (0x1F115, '3', u'(f)'),
+ (0x1F116, '3', u'(g)'),
+ (0x1F117, '3', u'(h)'),
+ (0x1F118, '3', u'(i)'),
+ (0x1F119, '3', u'(j)'),
+ (0x1F11A, '3', u'(k)'),
+ (0x1F11B, '3', u'(l)'),
+ (0x1F11C, '3', u'(m)'),
+ (0x1F11D, '3', u'(n)'),
+ (0x1F11E, '3', u'(o)'),
+ (0x1F11F, '3', u'(p)'),
+ (0x1F120, '3', u'(q)'),
+ (0x1F121, '3', u'(r)'),
+ (0x1F122, '3', u'(s)'),
+ (0x1F123, '3', u'(t)'),
+ (0x1F124, '3', u'(u)'),
+ (0x1F125, '3', u'(v)'),
+ (0x1F126, '3', u'(w)'),
+ (0x1F127, '3', u'(x)'),
+ (0x1F128, '3', u'(y)'),
+ (0x1F129, '3', u'(z)'),
+ (0x1F12A, 'M', u'〔s〕'),
+ (0x1F12B, 'M', u'c'),
+ (0x1F12C, 'M', u'r'),
+ (0x1F12D, 'M', u'cd'),
+ (0x1F12E, 'M', u'wz'),
+ (0x1F12F, 'V'),
+ (0x1F130, 'M', u'a'),
+ (0x1F131, 'M', u'b'),
+ (0x1F132, 'M', u'c'),
+ (0x1F133, 'M', u'd'),
+ (0x1F134, 'M', u'e'),
+ (0x1F135, 'M', u'f'),
+ (0x1F136, 'M', u'g'),
+ (0x1F137, 'M', u'h'),
+ (0x1F138, 'M', u'i'),
+ (0x1F139, 'M', u'j'),
+ (0x1F13A, 'M', u'k'),
+ (0x1F13B, 'M', u'l'),
+ (0x1F13C, 'M', u'm'),
+ (0x1F13D, 'M', u'n'),
+ (0x1F13E, 'M', u'o'),
+ (0x1F13F, 'M', u'p'),
+ (0x1F140, 'M', u'q'),
+ (0x1F141, 'M', u'r'),
+ (0x1F142, 'M', u's'),
+ (0x1F143, 'M', u't'),
+ (0x1F144, 'M', u'u'),
+ (0x1F145, 'M', u'v'),
+ (0x1F146, 'M', u'w'),
+ (0x1F147, 'M', u'x'),
+ (0x1F148, 'M', u'y'),
+ (0x1F149, 'M', u'z'),
+ (0x1F14A, 'M', u'hv'),
+ (0x1F14B, 'M', u'mv'),
+ (0x1F14C, 'M', u'sd'),
+ (0x1F14D, 'M', u'ss'),
+ (0x1F14E, 'M', u'ppv'),
+ (0x1F14F, 'M', u'wc'),
+ (0x1F150, 'V'),
+ (0x1F16A, 'M', u'mc'),
+ (0x1F16B, 'M', u'md'),
+ (0x1F16C, 'M', u'mr'),
+ (0x1F16D, 'V'),
+ (0x1F190, 'M', u'dj'),
+ (0x1F191, 'V'),
+ (0x1F1AE, 'X'),
+ (0x1F1E6, 'V'),
+ (0x1F200, 'M', u'ほか'),
+ (0x1F201, 'M', u'ココ'),
+ (0x1F202, 'M', u'サ'),
+ (0x1F203, 'X'),
+ (0x1F210, 'M', u'手'),
+ (0x1F211, 'M', u'字'),
+ (0x1F212, 'M', u'双'),
+ (0x1F213, 'M', u'デ'),
+ (0x1F214, 'M', u'二'),
+ (0x1F215, 'M', u'多'),
+ (0x1F216, 'M', u'解'),
+ (0x1F217, 'M', u'天'),
+ (0x1F218, 'M', u'交'),
+ (0x1F219, 'M', u'映'),
+ (0x1F21A, 'M', u'無'),
+ (0x1F21B, 'M', u'料'),
+ (0x1F21C, 'M', u'前'),
+ (0x1F21D, 'M', u'後'),
+ (0x1F21E, 'M', u'再'),
+ (0x1F21F, 'M', u'新'),
+ (0x1F220, 'M', u'初'),
+ (0x1F221, 'M', u'終'),
+ (0x1F222, 'M', u'生'),
+ (0x1F223, 'M', u'販'),
+ ]
+
+def _seg_73():
+ return [
+ (0x1F224, 'M', u'声'),
+ (0x1F225, 'M', u'吹'),
+ (0x1F226, 'M', u'演'),
+ (0x1F227, 'M', u'投'),
+ (0x1F228, 'M', u'捕'),
+ (0x1F229, 'M', u'一'),
+ (0x1F22A, 'M', u'三'),
+ (0x1F22B, 'M', u'遊'),
+ (0x1F22C, 'M', u'左'),
+ (0x1F22D, 'M', u'中'),
+ (0x1F22E, 'M', u'右'),
+ (0x1F22F, 'M', u'指'),
+ (0x1F230, 'M', u'走'),
+ (0x1F231, 'M', u'打'),
+ (0x1F232, 'M', u'禁'),
+ (0x1F233, 'M', u'空'),
+ (0x1F234, 'M', u'合'),
+ (0x1F235, 'M', u'満'),
+ (0x1F236, 'M', u'有'),
+ (0x1F237, 'M', u'月'),
+ (0x1F238, 'M', u'申'),
+ (0x1F239, 'M', u'割'),
+ (0x1F23A, 'M', u'営'),
+ (0x1F23B, 'M', u'配'),
+ (0x1F23C, 'X'),
+ (0x1F240, 'M', u'〔本〕'),
+ (0x1F241, 'M', u'〔三〕'),
+ (0x1F242, 'M', u'〔二〕'),
+ (0x1F243, 'M', u'〔安〕'),
+ (0x1F244, 'M', u'〔点〕'),
+ (0x1F245, 'M', u'〔打〕'),
+ (0x1F246, 'M', u'〔盗〕'),
+ (0x1F247, 'M', u'〔勝〕'),
+ (0x1F248, 'M', u'〔敗〕'),
+ (0x1F249, 'X'),
+ (0x1F250, 'M', u'得'),
+ (0x1F251, 'M', u'可'),
+ (0x1F252, 'X'),
+ (0x1F260, 'V'),
+ (0x1F266, 'X'),
+ (0x1F300, 'V'),
+ (0x1F6D8, 'X'),
+ (0x1F6E0, 'V'),
+ (0x1F6ED, 'X'),
+ (0x1F6F0, 'V'),
+ (0x1F6FD, 'X'),
+ (0x1F700, 'V'),
+ (0x1F774, 'X'),
+ (0x1F780, 'V'),
+ (0x1F7D9, 'X'),
+ (0x1F7E0, 'V'),
+ (0x1F7EC, 'X'),
+ (0x1F800, 'V'),
+ (0x1F80C, 'X'),
+ (0x1F810, 'V'),
+ (0x1F848, 'X'),
+ (0x1F850, 'V'),
+ (0x1F85A, 'X'),
+ (0x1F860, 'V'),
+ (0x1F888, 'X'),
+ (0x1F890, 'V'),
+ (0x1F8AE, 'X'),
+ (0x1F8B0, 'V'),
+ (0x1F8B2, 'X'),
+ (0x1F900, 'V'),
+ (0x1F979, 'X'),
+ (0x1F97A, 'V'),
+ (0x1F9CC, 'X'),
+ (0x1F9CD, 'V'),
+ (0x1FA54, 'X'),
+ (0x1FA60, 'V'),
+ (0x1FA6E, 'X'),
+ (0x1FA70, 'V'),
+ (0x1FA75, 'X'),
+ (0x1FA78, 'V'),
+ (0x1FA7B, 'X'),
+ (0x1FA80, 'V'),
+ (0x1FA87, 'X'),
+ (0x1FA90, 'V'),
+ (0x1FAA9, 'X'),
+ (0x1FAB0, 'V'),
+ (0x1FAB7, 'X'),
+ (0x1FAC0, 'V'),
+ (0x1FAC3, 'X'),
+ (0x1FAD0, 'V'),
+ (0x1FAD7, 'X'),
+ (0x1FB00, 'V'),
+ (0x1FB93, 'X'),
+ (0x1FB94, 'V'),
+ (0x1FBCB, 'X'),
+ (0x1FBF0, 'M', u'0'),
+ (0x1FBF1, 'M', u'1'),
+ (0x1FBF2, 'M', u'2'),
+ (0x1FBF3, 'M', u'3'),
+ (0x1FBF4, 'M', u'4'),
+ (0x1FBF5, 'M', u'5'),
+ (0x1FBF6, 'M', u'6'),
+ (0x1FBF7, 'M', u'7'),
+ (0x1FBF8, 'M', u'8'),
+ (0x1FBF9, 'M', u'9'),
+ ]
+
+def _seg_74():
+ return [
+ (0x1FBFA, 'X'),
+ (0x20000, 'V'),
+ (0x2A6DE, 'X'),
+ (0x2A700, 'V'),
+ (0x2B735, 'X'),
+ (0x2B740, 'V'),
+ (0x2B81E, 'X'),
+ (0x2B820, 'V'),
+ (0x2CEA2, 'X'),
+ (0x2CEB0, 'V'),
+ (0x2EBE1, 'X'),
+ (0x2F800, 'M', u'丽'),
+ (0x2F801, 'M', u'丸'),
+ (0x2F802, 'M', u'乁'),
+ (0x2F803, 'M', u'𠄢'),
+ (0x2F804, 'M', u'你'),
+ (0x2F805, 'M', u'侮'),
+ (0x2F806, 'M', u'侻'),
+ (0x2F807, 'M', u'倂'),
+ (0x2F808, 'M', u'偺'),
+ (0x2F809, 'M', u'備'),
+ (0x2F80A, 'M', u'僧'),
+ (0x2F80B, 'M', u'像'),
+ (0x2F80C, 'M', u'㒞'),
+ (0x2F80D, 'M', u'𠘺'),
+ (0x2F80E, 'M', u'免'),
+ (0x2F80F, 'M', u'兔'),
+ (0x2F810, 'M', u'兤'),
+ (0x2F811, 'M', u'具'),
+ (0x2F812, 'M', u'𠔜'),
+ (0x2F813, 'M', u'㒹'),
+ (0x2F814, 'M', u'內'),
+ (0x2F815, 'M', u'再'),
+ (0x2F816, 'M', u'𠕋'),
+ (0x2F817, 'M', u'冗'),
+ (0x2F818, 'M', u'冤'),
+ (0x2F819, 'M', u'仌'),
+ (0x2F81A, 'M', u'冬'),
+ (0x2F81B, 'M', u'况'),
+ (0x2F81C, 'M', u'𩇟'),
+ (0x2F81D, 'M', u'凵'),
+ (0x2F81E, 'M', u'刃'),
+ (0x2F81F, 'M', u'㓟'),
+ (0x2F820, 'M', u'刻'),
+ (0x2F821, 'M', u'剆'),
+ (0x2F822, 'M', u'割'),
+ (0x2F823, 'M', u'剷'),
+ (0x2F824, 'M', u'㔕'),
+ (0x2F825, 'M', u'勇'),
+ (0x2F826, 'M', u'勉'),
+ (0x2F827, 'M', u'勤'),
+ (0x2F828, 'M', u'勺'),
+ (0x2F829, 'M', u'包'),
+ (0x2F82A, 'M', u'匆'),
+ (0x2F82B, 'M', u'北'),
+ (0x2F82C, 'M', u'卉'),
+ (0x2F82D, 'M', u'卑'),
+ (0x2F82E, 'M', u'博'),
+ (0x2F82F, 'M', u'即'),
+ (0x2F830, 'M', u'卽'),
+ (0x2F831, 'M', u'卿'),
+ (0x2F834, 'M', u'𠨬'),
+ (0x2F835, 'M', u'灰'),
+ (0x2F836, 'M', u'及'),
+ (0x2F837, 'M', u'叟'),
+ (0x2F838, 'M', u'𠭣'),
+ (0x2F839, 'M', u'叫'),
+ (0x2F83A, 'M', u'叱'),
+ (0x2F83B, 'M', u'吆'),
+ (0x2F83C, 'M', u'咞'),
+ (0x2F83D, 'M', u'吸'),
+ (0x2F83E, 'M', u'呈'),
+ (0x2F83F, 'M', u'周'),
+ (0x2F840, 'M', u'咢'),
+ (0x2F841, 'M', u'哶'),
+ (0x2F842, 'M', u'唐'),
+ (0x2F843, 'M', u'啓'),
+ (0x2F844, 'M', u'啣'),
+ (0x2F845, 'M', u'善'),
+ (0x2F847, 'M', u'喙'),
+ (0x2F848, 'M', u'喫'),
+ (0x2F849, 'M', u'喳'),
+ (0x2F84A, 'M', u'嗂'),
+ (0x2F84B, 'M', u'圖'),
+ (0x2F84C, 'M', u'嘆'),
+ (0x2F84D, 'M', u'圗'),
+ (0x2F84E, 'M', u'噑'),
+ (0x2F84F, 'M', u'噴'),
+ (0x2F850, 'M', u'切'),
+ (0x2F851, 'M', u'壮'),
+ (0x2F852, 'M', u'城'),
+ (0x2F853, 'M', u'埴'),
+ (0x2F854, 'M', u'堍'),
+ (0x2F855, 'M', u'型'),
+ (0x2F856, 'M', u'堲'),
+ (0x2F857, 'M', u'報'),
+ (0x2F858, 'M', u'墬'),
+ (0x2F859, 'M', u'𡓤'),
+ (0x2F85A, 'M', u'売'),
+ (0x2F85B, 'M', u'壷'),
+ ]
+
+def _seg_75():
+ return [
+ (0x2F85C, 'M', u'夆'),
+ (0x2F85D, 'M', u'多'),
+ (0x2F85E, 'M', u'夢'),
+ (0x2F85F, 'M', u'奢'),
+ (0x2F860, 'M', u'𡚨'),
+ (0x2F861, 'M', u'𡛪'),
+ (0x2F862, 'M', u'姬'),
+ (0x2F863, 'M', u'娛'),
+ (0x2F864, 'M', u'娧'),
+ (0x2F865, 'M', u'姘'),
+ (0x2F866, 'M', u'婦'),
+ (0x2F867, 'M', u'㛮'),
+ (0x2F868, 'X'),
+ (0x2F869, 'M', u'嬈'),
+ (0x2F86A, 'M', u'嬾'),
+ (0x2F86C, 'M', u'𡧈'),
+ (0x2F86D, 'M', u'寃'),
+ (0x2F86E, 'M', u'寘'),
+ (0x2F86F, 'M', u'寧'),
+ (0x2F870, 'M', u'寳'),
+ (0x2F871, 'M', u'𡬘'),
+ (0x2F872, 'M', u'寿'),
+ (0x2F873, 'M', u'将'),
+ (0x2F874, 'X'),
+ (0x2F875, 'M', u'尢'),
+ (0x2F876, 'M', u'㞁'),
+ (0x2F877, 'M', u'屠'),
+ (0x2F878, 'M', u'屮'),
+ (0x2F879, 'M', u'峀'),
+ (0x2F87A, 'M', u'岍'),
+ (0x2F87B, 'M', u'𡷤'),
+ (0x2F87C, 'M', u'嵃'),
+ (0x2F87D, 'M', u'𡷦'),
+ (0x2F87E, 'M', u'嵮'),
+ (0x2F87F, 'M', u'嵫'),
+ (0x2F880, 'M', u'嵼'),
+ (0x2F881, 'M', u'巡'),
+ (0x2F882, 'M', u'巢'),
+ (0x2F883, 'M', u'㠯'),
+ (0x2F884, 'M', u'巽'),
+ (0x2F885, 'M', u'帨'),
+ (0x2F886, 'M', u'帽'),
+ (0x2F887, 'M', u'幩'),
+ (0x2F888, 'M', u'㡢'),
+ (0x2F889, 'M', u'𢆃'),
+ (0x2F88A, 'M', u'㡼'),
+ (0x2F88B, 'M', u'庰'),
+ (0x2F88C, 'M', u'庳'),
+ (0x2F88D, 'M', u'庶'),
+ (0x2F88E, 'M', u'廊'),
+ (0x2F88F, 'M', u'𪎒'),
+ (0x2F890, 'M', u'廾'),
+ (0x2F891, 'M', u'𢌱'),
+ (0x2F893, 'M', u'舁'),
+ (0x2F894, 'M', u'弢'),
+ (0x2F896, 'M', u'㣇'),
+ (0x2F897, 'M', u'𣊸'),
+ (0x2F898, 'M', u'𦇚'),
+ (0x2F899, 'M', u'形'),
+ (0x2F89A, 'M', u'彫'),
+ (0x2F89B, 'M', u'㣣'),
+ (0x2F89C, 'M', u'徚'),
+ (0x2F89D, 'M', u'忍'),
+ (0x2F89E, 'M', u'志'),
+ (0x2F89F, 'M', u'忹'),
+ (0x2F8A0, 'M', u'悁'),
+ (0x2F8A1, 'M', u'㤺'),
+ (0x2F8A2, 'M', u'㤜'),
+ (0x2F8A3, 'M', u'悔'),
+ (0x2F8A4, 'M', u'𢛔'),
+ (0x2F8A5, 'M', u'惇'),
+ (0x2F8A6, 'M', u'慈'),
+ (0x2F8A7, 'M', u'慌'),
+ (0x2F8A8, 'M', u'慎'),
+ (0x2F8A9, 'M', u'慌'),
+ (0x2F8AA, 'M', u'慺'),
+ (0x2F8AB, 'M', u'憎'),
+ (0x2F8AC, 'M', u'憲'),
+ (0x2F8AD, 'M', u'憤'),
+ (0x2F8AE, 'M', u'憯'),
+ (0x2F8AF, 'M', u'懞'),
+ (0x2F8B0, 'M', u'懲'),
+ (0x2F8B1, 'M', u'懶'),
+ (0x2F8B2, 'M', u'成'),
+ (0x2F8B3, 'M', u'戛'),
+ (0x2F8B4, 'M', u'扝'),
+ (0x2F8B5, 'M', u'抱'),
+ (0x2F8B6, 'M', u'拔'),
+ (0x2F8B7, 'M', u'捐'),
+ (0x2F8B8, 'M', u'𢬌'),
+ (0x2F8B9, 'M', u'挽'),
+ (0x2F8BA, 'M', u'拼'),
+ (0x2F8BB, 'M', u'捨'),
+ (0x2F8BC, 'M', u'掃'),
+ (0x2F8BD, 'M', u'揤'),
+ (0x2F8BE, 'M', u'𢯱'),
+ (0x2F8BF, 'M', u'搢'),
+ (0x2F8C0, 'M', u'揅'),
+ (0x2F8C1, 'M', u'掩'),
+ (0x2F8C2, 'M', u'㨮'),
+ ]
+
+def _seg_76():
+ return [
+ (0x2F8C3, 'M', u'摩'),
+ (0x2F8C4, 'M', u'摾'),
+ (0x2F8C5, 'M', u'撝'),
+ (0x2F8C6, 'M', u'摷'),
+ (0x2F8C7, 'M', u'㩬'),
+ (0x2F8C8, 'M', u'敏'),
+ (0x2F8C9, 'M', u'敬'),
+ (0x2F8CA, 'M', u'𣀊'),
+ (0x2F8CB, 'M', u'旣'),
+ (0x2F8CC, 'M', u'書'),
+ (0x2F8CD, 'M', u'晉'),
+ (0x2F8CE, 'M', u'㬙'),
+ (0x2F8CF, 'M', u'暑'),
+ (0x2F8D0, 'M', u'㬈'),
+ (0x2F8D1, 'M', u'㫤'),
+ (0x2F8D2, 'M', u'冒'),
+ (0x2F8D3, 'M', u'冕'),
+ (0x2F8D4, 'M', u'最'),
+ (0x2F8D5, 'M', u'暜'),
+ (0x2F8D6, 'M', u'肭'),
+ (0x2F8D7, 'M', u'䏙'),
+ (0x2F8D8, 'M', u'朗'),
+ (0x2F8D9, 'M', u'望'),
+ (0x2F8DA, 'M', u'朡'),
+ (0x2F8DB, 'M', u'杞'),
+ (0x2F8DC, 'M', u'杓'),
+ (0x2F8DD, 'M', u'𣏃'),
+ (0x2F8DE, 'M', u'㭉'),
+ (0x2F8DF, 'M', u'柺'),
+ (0x2F8E0, 'M', u'枅'),
+ (0x2F8E1, 'M', u'桒'),
+ (0x2F8E2, 'M', u'梅'),
+ (0x2F8E3, 'M', u'𣑭'),
+ (0x2F8E4, 'M', u'梎'),
+ (0x2F8E5, 'M', u'栟'),
+ (0x2F8E6, 'M', u'椔'),
+ (0x2F8E7, 'M', u'㮝'),
+ (0x2F8E8, 'M', u'楂'),
+ (0x2F8E9, 'M', u'榣'),
+ (0x2F8EA, 'M', u'槪'),
+ (0x2F8EB, 'M', u'檨'),
+ (0x2F8EC, 'M', u'𣚣'),
+ (0x2F8ED, 'M', u'櫛'),
+ (0x2F8EE, 'M', u'㰘'),
+ (0x2F8EF, 'M', u'次'),
+ (0x2F8F0, 'M', u'𣢧'),
+ (0x2F8F1, 'M', u'歔'),
+ (0x2F8F2, 'M', u'㱎'),
+ (0x2F8F3, 'M', u'歲'),
+ (0x2F8F4, 'M', u'殟'),
+ (0x2F8F5, 'M', u'殺'),
+ (0x2F8F6, 'M', u'殻'),
+ (0x2F8F7, 'M', u'𣪍'),
+ (0x2F8F8, 'M', u'𡴋'),
+ (0x2F8F9, 'M', u'𣫺'),
+ (0x2F8FA, 'M', u'汎'),
+ (0x2F8FB, 'M', u'𣲼'),
+ (0x2F8FC, 'M', u'沿'),
+ (0x2F8FD, 'M', u'泍'),
+ (0x2F8FE, 'M', u'汧'),
+ (0x2F8FF, 'M', u'洖'),
+ (0x2F900, 'M', u'派'),
+ (0x2F901, 'M', u'海'),
+ (0x2F902, 'M', u'流'),
+ (0x2F903, 'M', u'浩'),
+ (0x2F904, 'M', u'浸'),
+ (0x2F905, 'M', u'涅'),
+ (0x2F906, 'M', u'𣴞'),
+ (0x2F907, 'M', u'洴'),
+ (0x2F908, 'M', u'港'),
+ (0x2F909, 'M', u'湮'),
+ (0x2F90A, 'M', u'㴳'),
+ (0x2F90B, 'M', u'滋'),
+ (0x2F90C, 'M', u'滇'),
+ (0x2F90D, 'M', u'𣻑'),
+ (0x2F90E, 'M', u'淹'),
+ (0x2F90F, 'M', u'潮'),
+ (0x2F910, 'M', u'𣽞'),
+ (0x2F911, 'M', u'𣾎'),
+ (0x2F912, 'M', u'濆'),
+ (0x2F913, 'M', u'瀹'),
+ (0x2F914, 'M', u'瀞'),
+ (0x2F915, 'M', u'瀛'),
+ (0x2F916, 'M', u'㶖'),
+ (0x2F917, 'M', u'灊'),
+ (0x2F918, 'M', u'災'),
+ (0x2F919, 'M', u'灷'),
+ (0x2F91A, 'M', u'炭'),
+ (0x2F91B, 'M', u'𠔥'),
+ (0x2F91C, 'M', u'煅'),
+ (0x2F91D, 'M', u'𤉣'),
+ (0x2F91E, 'M', u'熜'),
+ (0x2F91F, 'X'),
+ (0x2F920, 'M', u'爨'),
+ (0x2F921, 'M', u'爵'),
+ (0x2F922, 'M', u'牐'),
+ (0x2F923, 'M', u'𤘈'),
+ (0x2F924, 'M', u'犀'),
+ (0x2F925, 'M', u'犕'),
+ (0x2F926, 'M', u'𤜵'),
+ ]
+
+def _seg_77():
+ return [
+ (0x2F927, 'M', u'𤠔'),
+ (0x2F928, 'M', u'獺'),
+ (0x2F929, 'M', u'王'),
+ (0x2F92A, 'M', u'㺬'),
+ (0x2F92B, 'M', u'玥'),
+ (0x2F92C, 'M', u'㺸'),
+ (0x2F92E, 'M', u'瑇'),
+ (0x2F92F, 'M', u'瑜'),
+ (0x2F930, 'M', u'瑱'),
+ (0x2F931, 'M', u'璅'),
+ (0x2F932, 'M', u'瓊'),
+ (0x2F933, 'M', u'㼛'),
+ (0x2F934, 'M', u'甤'),
+ (0x2F935, 'M', u'𤰶'),
+ (0x2F936, 'M', u'甾'),
+ (0x2F937, 'M', u'𤲒'),
+ (0x2F938, 'M', u'異'),
+ (0x2F939, 'M', u'𢆟'),
+ (0x2F93A, 'M', u'瘐'),
+ (0x2F93B, 'M', u'𤾡'),
+ (0x2F93C, 'M', u'𤾸'),
+ (0x2F93D, 'M', u'𥁄'),
+ (0x2F93E, 'M', u'㿼'),
+ (0x2F93F, 'M', u'䀈'),
+ (0x2F940, 'M', u'直'),
+ (0x2F941, 'M', u'𥃳'),
+ (0x2F942, 'M', u'𥃲'),
+ (0x2F943, 'M', u'𥄙'),
+ (0x2F944, 'M', u'𥄳'),
+ (0x2F945, 'M', u'眞'),
+ (0x2F946, 'M', u'真'),
+ (0x2F948, 'M', u'睊'),
+ (0x2F949, 'M', u'䀹'),
+ (0x2F94A, 'M', u'瞋'),
+ (0x2F94B, 'M', u'䁆'),
+ (0x2F94C, 'M', u'䂖'),
+ (0x2F94D, 'M', u'𥐝'),
+ (0x2F94E, 'M', u'硎'),
+ (0x2F94F, 'M', u'碌'),
+ (0x2F950, 'M', u'磌'),
+ (0x2F951, 'M', u'䃣'),
+ (0x2F952, 'M', u'𥘦'),
+ (0x2F953, 'M', u'祖'),
+ (0x2F954, 'M', u'𥚚'),
+ (0x2F955, 'M', u'𥛅'),
+ (0x2F956, 'M', u'福'),
+ (0x2F957, 'M', u'秫'),
+ (0x2F958, 'M', u'䄯'),
+ (0x2F959, 'M', u'穀'),
+ (0x2F95A, 'M', u'穊'),
+ (0x2F95B, 'M', u'穏'),
+ (0x2F95C, 'M', u'𥥼'),
+ (0x2F95D, 'M', u'𥪧'),
+ (0x2F95F, 'X'),
+ (0x2F960, 'M', u'䈂'),
+ (0x2F961, 'M', u'𥮫'),
+ (0x2F962, 'M', u'篆'),
+ (0x2F963, 'M', u'築'),
+ (0x2F964, 'M', u'䈧'),
+ (0x2F965, 'M', u'𥲀'),
+ (0x2F966, 'M', u'糒'),
+ (0x2F967, 'M', u'䊠'),
+ (0x2F968, 'M', u'糨'),
+ (0x2F969, 'M', u'糣'),
+ (0x2F96A, 'M', u'紀'),
+ (0x2F96B, 'M', u'𥾆'),
+ (0x2F96C, 'M', u'絣'),
+ (0x2F96D, 'M', u'䌁'),
+ (0x2F96E, 'M', u'緇'),
+ (0x2F96F, 'M', u'縂'),
+ (0x2F970, 'M', u'繅'),
+ (0x2F971, 'M', u'䌴'),
+ (0x2F972, 'M', u'𦈨'),
+ (0x2F973, 'M', u'𦉇'),
+ (0x2F974, 'M', u'䍙'),
+ (0x2F975, 'M', u'𦋙'),
+ (0x2F976, 'M', u'罺'),
+ (0x2F977, 'M', u'𦌾'),
+ (0x2F978, 'M', u'羕'),
+ (0x2F979, 'M', u'翺'),
+ (0x2F97A, 'M', u'者'),
+ (0x2F97B, 'M', u'𦓚'),
+ (0x2F97C, 'M', u'𦔣'),
+ (0x2F97D, 'M', u'聠'),
+ (0x2F97E, 'M', u'𦖨'),
+ (0x2F97F, 'M', u'聰'),
+ (0x2F980, 'M', u'𣍟'),
+ (0x2F981, 'M', u'䏕'),
+ (0x2F982, 'M', u'育'),
+ (0x2F983, 'M', u'脃'),
+ (0x2F984, 'M', u'䐋'),
+ (0x2F985, 'M', u'脾'),
+ (0x2F986, 'M', u'媵'),
+ (0x2F987, 'M', u'𦞧'),
+ (0x2F988, 'M', u'𦞵'),
+ (0x2F989, 'M', u'𣎓'),
+ (0x2F98A, 'M', u'𣎜'),
+ (0x2F98B, 'M', u'舁'),
+ (0x2F98C, 'M', u'舄'),
+ (0x2F98D, 'M', u'辞'),
+ ]
+
+def _seg_78():
+ return [
+ (0x2F98E, 'M', u'䑫'),
+ (0x2F98F, 'M', u'芑'),
+ (0x2F990, 'M', u'芋'),
+ (0x2F991, 'M', u'芝'),
+ (0x2F992, 'M', u'劳'),
+ (0x2F993, 'M', u'花'),
+ (0x2F994, 'M', u'芳'),
+ (0x2F995, 'M', u'芽'),
+ (0x2F996, 'M', u'苦'),
+ (0x2F997, 'M', u'𦬼'),
+ (0x2F998, 'M', u'若'),
+ (0x2F999, 'M', u'茝'),
+ (0x2F99A, 'M', u'荣'),
+ (0x2F99B, 'M', u'莭'),
+ (0x2F99C, 'M', u'茣'),
+ (0x2F99D, 'M', u'莽'),
+ (0x2F99E, 'M', u'菧'),
+ (0x2F99F, 'M', u'著'),
+ (0x2F9A0, 'M', u'荓'),
+ (0x2F9A1, 'M', u'菊'),
+ (0x2F9A2, 'M', u'菌'),
+ (0x2F9A3, 'M', u'菜'),
+ (0x2F9A4, 'M', u'𦰶'),
+ (0x2F9A5, 'M', u'𦵫'),
+ (0x2F9A6, 'M', u'𦳕'),
+ (0x2F9A7, 'M', u'䔫'),
+ (0x2F9A8, 'M', u'蓱'),
+ (0x2F9A9, 'M', u'蓳'),
+ (0x2F9AA, 'M', u'蔖'),
+ (0x2F9AB, 'M', u'𧏊'),
+ (0x2F9AC, 'M', u'蕤'),
+ (0x2F9AD, 'M', u'𦼬'),
+ (0x2F9AE, 'M', u'䕝'),
+ (0x2F9AF, 'M', u'䕡'),
+ (0x2F9B0, 'M', u'𦾱'),
+ (0x2F9B1, 'M', u'𧃒'),
+ (0x2F9B2, 'M', u'䕫'),
+ (0x2F9B3, 'M', u'虐'),
+ (0x2F9B4, 'M', u'虜'),
+ (0x2F9B5, 'M', u'虧'),
+ (0x2F9B6, 'M', u'虩'),
+ (0x2F9B7, 'M', u'蚩'),
+ (0x2F9B8, 'M', u'蚈'),
+ (0x2F9B9, 'M', u'蜎'),
+ (0x2F9BA, 'M', u'蛢'),
+ (0x2F9BB, 'M', u'蝹'),
+ (0x2F9BC, 'M', u'蜨'),
+ (0x2F9BD, 'M', u'蝫'),
+ (0x2F9BE, 'M', u'螆'),
+ (0x2F9BF, 'X'),
+ (0x2F9C0, 'M', u'蟡'),
+ (0x2F9C1, 'M', u'蠁'),
+ (0x2F9C2, 'M', u'䗹'),
+ (0x2F9C3, 'M', u'衠'),
+ (0x2F9C4, 'M', u'衣'),
+ (0x2F9C5, 'M', u'𧙧'),
+ (0x2F9C6, 'M', u'裗'),
+ (0x2F9C7, 'M', u'裞'),
+ (0x2F9C8, 'M', u'䘵'),
+ (0x2F9C9, 'M', u'裺'),
+ (0x2F9CA, 'M', u'㒻'),
+ (0x2F9CB, 'M', u'𧢮'),
+ (0x2F9CC, 'M', u'𧥦'),
+ (0x2F9CD, 'M', u'䚾'),
+ (0x2F9CE, 'M', u'䛇'),
+ (0x2F9CF, 'M', u'誠'),
+ (0x2F9D0, 'M', u'諭'),
+ (0x2F9D1, 'M', u'變'),
+ (0x2F9D2, 'M', u'豕'),
+ (0x2F9D3, 'M', u'𧲨'),
+ (0x2F9D4, 'M', u'貫'),
+ (0x2F9D5, 'M', u'賁'),
+ (0x2F9D6, 'M', u'贛'),
+ (0x2F9D7, 'M', u'起'),
+ (0x2F9D8, 'M', u'𧼯'),
+ (0x2F9D9, 'M', u'𠠄'),
+ (0x2F9DA, 'M', u'跋'),
+ (0x2F9DB, 'M', u'趼'),
+ (0x2F9DC, 'M', u'跰'),
+ (0x2F9DD, 'M', u'𠣞'),
+ (0x2F9DE, 'M', u'軔'),
+ (0x2F9DF, 'M', u'輸'),
+ (0x2F9E0, 'M', u'𨗒'),
+ (0x2F9E1, 'M', u'𨗭'),
+ (0x2F9E2, 'M', u'邔'),
+ (0x2F9E3, 'M', u'郱'),
+ (0x2F9E4, 'M', u'鄑'),
+ (0x2F9E5, 'M', u'𨜮'),
+ (0x2F9E6, 'M', u'鄛'),
+ (0x2F9E7, 'M', u'鈸'),
+ (0x2F9E8, 'M', u'鋗'),
+ (0x2F9E9, 'M', u'鋘'),
+ (0x2F9EA, 'M', u'鉼'),
+ (0x2F9EB, 'M', u'鏹'),
+ (0x2F9EC, 'M', u'鐕'),
+ (0x2F9ED, 'M', u'𨯺'),
+ (0x2F9EE, 'M', u'開'),
+ (0x2F9EF, 'M', u'䦕'),
+ (0x2F9F0, 'M', u'閷'),
+ (0x2F9F1, 'M', u'𨵷'),
+ ]
+
+def _seg_79():
+ return [
+ (0x2F9F2, 'M', u'䧦'),
+ (0x2F9F3, 'M', u'雃'),
+ (0x2F9F4, 'M', u'嶲'),
+ (0x2F9F5, 'M', u'霣'),
+ (0x2F9F6, 'M', u'𩅅'),
+ (0x2F9F7, 'M', u'𩈚'),
+ (0x2F9F8, 'M', u'䩮'),
+ (0x2F9F9, 'M', u'䩶'),
+ (0x2F9FA, 'M', u'韠'),
+ (0x2F9FB, 'M', u'𩐊'),
+ (0x2F9FC, 'M', u'䪲'),
+ (0x2F9FD, 'M', u'𩒖'),
+ (0x2F9FE, 'M', u'頋'),
+ (0x2FA00, 'M', u'頩'),
+ (0x2FA01, 'M', u'𩖶'),
+ (0x2FA02, 'M', u'飢'),
+ (0x2FA03, 'M', u'䬳'),
+ (0x2FA04, 'M', u'餩'),
+ (0x2FA05, 'M', u'馧'),
+ (0x2FA06, 'M', u'駂'),
+ (0x2FA07, 'M', u'駾'),
+ (0x2FA08, 'M', u'䯎'),
+ (0x2FA09, 'M', u'𩬰'),
+ (0x2FA0A, 'M', u'鬒'),
+ (0x2FA0B, 'M', u'鱀'),
+ (0x2FA0C, 'M', u'鳽'),
+ (0x2FA0D, 'M', u'䳎'),
+ (0x2FA0E, 'M', u'䳭'),
+ (0x2FA0F, 'M', u'鵧'),
+ (0x2FA10, 'M', u'𪃎'),
+ (0x2FA11, 'M', u'䳸'),
+ (0x2FA12, 'M', u'𪄅'),
+ (0x2FA13, 'M', u'𪈎'),
+ (0x2FA14, 'M', u'𪊑'),
+ (0x2FA15, 'M', u'麻'),
+ (0x2FA16, 'M', u'䵖'),
+ (0x2FA17, 'M', u'黹'),
+ (0x2FA18, 'M', u'黾'),
+ (0x2FA19, 'M', u'鼅'),
+ (0x2FA1A, 'M', u'鼏'),
+ (0x2FA1B, 'M', u'鼖'),
+ (0x2FA1C, 'M', u'鼻'),
+ (0x2FA1D, 'M', u'𪘀'),
+ (0x2FA1E, 'X'),
+ (0x30000, 'V'),
+ (0x3134B, 'X'),
+ (0xE0100, 'I'),
+ (0xE01F0, 'X'),
+ ]
+
+uts46data = tuple(
+ _seg_0()
+ + _seg_1()
+ + _seg_2()
+ + _seg_3()
+ + _seg_4()
+ + _seg_5()
+ + _seg_6()
+ + _seg_7()
+ + _seg_8()
+ + _seg_9()
+ + _seg_10()
+ + _seg_11()
+ + _seg_12()
+ + _seg_13()
+ + _seg_14()
+ + _seg_15()
+ + _seg_16()
+ + _seg_17()
+ + _seg_18()
+ + _seg_19()
+ + _seg_20()
+ + _seg_21()
+ + _seg_22()
+ + _seg_23()
+ + _seg_24()
+ + _seg_25()
+ + _seg_26()
+ + _seg_27()
+ + _seg_28()
+ + _seg_29()
+ + _seg_30()
+ + _seg_31()
+ + _seg_32()
+ + _seg_33()
+ + _seg_34()
+ + _seg_35()
+ + _seg_36()
+ + _seg_37()
+ + _seg_38()
+ + _seg_39()
+ + _seg_40()
+ + _seg_41()
+ + _seg_42()
+ + _seg_43()
+ + _seg_44()
+ + _seg_45()
+ + _seg_46()
+ + _seg_47()
+ + _seg_48()
+ + _seg_49()
+ + _seg_50()
+ + _seg_51()
+ + _seg_52()
+ + _seg_53()
+ + _seg_54()
+ + _seg_55()
+ + _seg_56()
+ + _seg_57()
+ + _seg_58()
+ + _seg_59()
+ + _seg_60()
+ + _seg_61()
+ + _seg_62()
+ + _seg_63()
+ + _seg_64()
+ + _seg_65()
+ + _seg_66()
+ + _seg_67()
+ + _seg_68()
+ + _seg_69()
+ + _seg_70()
+ + _seg_71()
+ + _seg_72()
+ + _seg_73()
+ + _seg_74()
+ + _seg_75()
+ + _seg_76()
+ + _seg_77()
+ + _seg_78()
+ + _seg_79()
+)
diff --git a/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/INSTALLER b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/INSTALLER
new file mode 100644
index 0000000..a1b589e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/INSTALLER
@@ -0,0 +1 @@
+pip
diff --git a/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/LICENSE.txt b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/LICENSE.txt
new file mode 100644
index 0000000..6fa54e6
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/LICENSE.txt
@@ -0,0 +1,20 @@
+Copyright (c) 2008-2020 The pip developers (see AUTHORS.txt file)
+
+Permission is hereby granted, free of charge, to any person obtaining
+a copy of this software and associated documentation files (the
+"Software"), to deal in the Software without restriction, including
+without limitation the rights to use, copy, modify, merge, publish,
+distribute, sublicense, and/or sell copies of the Software, and to
+permit persons to whom the Software is furnished to do so, subject to
+the following conditions:
+
+The above copyright notice and this permission notice shall be
+included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
diff --git a/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/METADATA b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/METADATA
new file mode 100644
index 0000000..5b66e8e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/METADATA
@@ -0,0 +1,92 @@
+Metadata-Version: 2.1
+Name: pip
+Version: 21.0.1
+Summary: The PyPA recommended tool for installing Python packages.
+Home-page: https://pip.pypa.io/
+Author: The pip developers
+Author-email: distutils-sig@python.org
+License: MIT
+Project-URL: Documentation, https://pip.pypa.io
+Project-URL: Source, https://github.com/pypa/pip
+Project-URL: Changelog, https://pip.pypa.io/en/stable/news/
+Keywords: distutils easy_install egg setuptools wheel virtualenv
+Platform: UNKNOWN
+Classifier: Development Status :: 5 - Production/Stable
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Topic :: Software Development :: Build Tools
+Classifier: Programming Language :: Python
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3 :: Only
+Classifier: Programming Language :: Python :: 3.6
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: Implementation :: CPython
+Classifier: Programming Language :: Python :: Implementation :: PyPy
+Requires-Python: >=3.6
+
+pip - The Python Package Installer
+==================================
+
+.. image:: https://img.shields.io/pypi/v/pip.svg
+ :target: https://pypi.org/project/pip/
+
+.. image:: https://readthedocs.org/projects/pip/badge/?version=latest
+ :target: https://pip.pypa.io/en/latest
+
+pip is the `package installer`_ for Python. You can use pip to install packages from the `Python Package Index`_ and other indexes.
+
+Please take a look at our documentation for how to install and use pip:
+
+* `Installation`_
+* `Usage`_
+
+We release updates regularly, with a new version every 3 months. Find more details in our documentation:
+
+* `Release notes`_
+* `Release process`_
+
+In pip 20.3, we've `made a big improvement to the heart of pip`_; `learn more`_. We want your input, so `sign up for our user experience research studies`_ to help us do it right.
+
+**Note**: pip 21.0, in January 2021, removed Python 2 support, per pip's `Python 2 support policy`_. Please migrate to Python 3.
+
+If you find bugs, need help, or want to talk to the developers, please use our mailing lists or chat rooms:
+
+* `Issue tracking`_
+* `Discourse channel`_
+* `User IRC`_
+
+If you want to get involved head over to GitHub to get the source code, look at our development documentation and feel free to jump on the developer mailing lists and chat rooms:
+
+* `GitHub page`_
+* `Development documentation`_
+* `Development mailing list`_
+* `Development IRC`_
+
+Code of Conduct
+---------------
+
+Everyone interacting in the pip project's codebases, issue trackers, chat
+rooms, and mailing lists is expected to follow the `PSF Code of Conduct`_.
+
+.. _package installer: https://packaging.python.org/guides/tool-recommendations/
+.. _Python Package Index: https://pypi.org
+.. _Installation: https://pip.pypa.io/en/stable/installing.html
+.. _Usage: https://pip.pypa.io/en/stable/
+.. _Release notes: https://pip.pypa.io/en/stable/news.html
+.. _Release process: https://pip.pypa.io/en/latest/development/release-process/
+.. _GitHub page: https://github.com/pypa/pip
+.. _Development documentation: https://pip.pypa.io/en/latest/development
+.. _made a big improvement to the heart of pip: https://pyfound.blogspot.com/2020/11/pip-20-3-new-resolver.html
+.. _learn more: https://pip.pypa.io/en/latest/user_guide/#changes-to-the-pip-dependency-resolver-in-20-3-2020
+.. _sign up for our user experience research studies: https://pyfound.blogspot.com/2020/03/new-pip-resolver-to-roll-out-this-year.html
+.. _Python 2 support policy: https://pip.pypa.io/en/latest/development/release-process/#python-2-support
+.. _Issue tracking: https://github.com/pypa/pip/issues
+.. _Discourse channel: https://discuss.python.org/c/packaging
+.. _Development mailing list: https://mail.python.org/mailman3/lists/distutils-sig.python.org/
+.. _User IRC: https://webchat.freenode.net/?channels=%23pypa
+.. _Development IRC: https://webchat.freenode.net/?channels=%23pypa-dev
+.. _PSF Code of Conduct: https://github.com/pypa/.github/blob/main/CODE_OF_CONDUCT.md
+
+
diff --git a/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/RECORD b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/RECORD
new file mode 100644
index 0000000..113ff66
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/RECORD
@@ -0,0 +1,761 @@
+../../../bin/pip,sha256=I-EyHqBP42IOGGmLzP8VGMAGGXJjfvnNFFxBDnAZfl8,258
+../../../bin/pip3,sha256=I-EyHqBP42IOGGmLzP8VGMAGGXJjfvnNFFxBDnAZfl8,258
+../../../bin/pip3.9,sha256=I-EyHqBP42IOGGmLzP8VGMAGGXJjfvnNFFxBDnAZfl8,258
+pip-21.0.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+pip-21.0.1.dist-info/LICENSE.txt,sha256=ejlw8iXn2TntLdOpADqlISSc1qhJJgiYAKMZmq713Gk,1110
+pip-21.0.1.dist-info/METADATA,sha256=a6mCPyb1qd3cdVI5OorlrDhSN3HHYiN8feJrxmL4QgY,4168
+pip-21.0.1.dist-info/RECORD,,
+pip-21.0.1.dist-info/WHEEL,sha256=OqRkF0eY5GHssMorFjlbTIq072vpHpF60fIQA6lS9xA,92
+pip-21.0.1.dist-info/entry_points.txt,sha256=5ExSa1s54zSPNA_1epJn5SX06786S8k5YHwskMvVYzw,125
+pip-21.0.1.dist-info/top_level.txt,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+pip/__init__.py,sha256=N22Wk52M-ZwIU8jx64XlNaLmHk_MyL1ErZ_71RG1Pzo,473
+pip/__main__.py,sha256=WGRSG7tdJrjefIHsZOk977H_rgkSt9z2liew-Cwm09U,874
+pip/__pycache__/__init__.cpython-39.pyc,,
+pip/__pycache__/__main__.cpython-39.pyc,,
+pip/_internal/__init__.py,sha256=fnY9L5BJfq79L8CXhLnj2nJMH8-JEpJkGQAMhM231AU,512
+pip/_internal/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/__pycache__/build_env.cpython-39.pyc,,
+pip/_internal/__pycache__/cache.cpython-39.pyc,,
+pip/_internal/__pycache__/configuration.cpython-39.pyc,,
+pip/_internal/__pycache__/exceptions.cpython-39.pyc,,
+pip/_internal/__pycache__/locations.cpython-39.pyc,,
+pip/_internal/__pycache__/main.cpython-39.pyc,,
+pip/_internal/__pycache__/pyproject.cpython-39.pyc,,
+pip/_internal/__pycache__/self_outdated_check.cpython-39.pyc,,
+pip/_internal/__pycache__/wheel_builder.cpython-39.pyc,,
+pip/_internal/build_env.py,sha256=mEgguVg9YnwbVVLtwUlDF5irYsweDksk67obP0KAjE8,8323
+pip/_internal/cache.py,sha256=j4UrFmwo2xC0e1QQUVAwPVuySmQttDUGJb-myD4t-Q8,10385
+pip/_internal/cli/__init__.py,sha256=9gMw_A_StJXzDh2Rhxil6bd8tFP-ZR719Q1pINHAw5I,136
+pip/_internal/cli/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/cli/__pycache__/autocompletion.cpython-39.pyc,,
+pip/_internal/cli/__pycache__/base_command.cpython-39.pyc,,
+pip/_internal/cli/__pycache__/cmdoptions.cpython-39.pyc,,
+pip/_internal/cli/__pycache__/command_context.cpython-39.pyc,,
+pip/_internal/cli/__pycache__/main.cpython-39.pyc,,
+pip/_internal/cli/__pycache__/main_parser.cpython-39.pyc,,
+pip/_internal/cli/__pycache__/parser.cpython-39.pyc,,
+pip/_internal/cli/__pycache__/progress_bars.cpython-39.pyc,,
+pip/_internal/cli/__pycache__/req_command.cpython-39.pyc,,
+pip/_internal/cli/__pycache__/spinners.cpython-39.pyc,,
+pip/_internal/cli/__pycache__/status_codes.cpython-39.pyc,,
+pip/_internal/cli/autocompletion.py,sha256=9iPihFS8MgmENqlAQOd80nPO3XWXNbl4yTFb-ZUIn1k,6711
+pip/_internal/cli/base_command.py,sha256=phmb0p-uI7fkc3yulvrjKvLwkoGwLXoejcvLJNX51s8,8127
+pip/_internal/cli/cmdoptions.py,sha256=WAXEdfKiz4dlEm4zGd_I9GY-jbpjBo83_2nDjPJCGos,29547
+pip/_internal/cli/command_context.py,sha256=edx8WCi04cZ-1jfMg3PnngxSwr2ZvlmSDdEAMsjfOvo,978
+pip/_internal/cli/main.py,sha256=elYNVqbCVwh1I2uKioP8YPCCoSEn_jOUXtS9zgb_ymE,2641
+pip/_internal/cli/main_parser.py,sha256=nWAmIGPraVBAV0lwg7prBPcTpqzZ8r0H7mm3d05QrfM,2894
+pip/_internal/cli/parser.py,sha256=_rp5QCCrsxFsBOKSt5CAMxSOqv0RjDgZBAnih6viO9k,10404
+pip/_internal/cli/progress_bars.py,sha256=3DmxcO5HuBVpXFGKGGi5p_seeUiI4X_XV3Tl4qMg0PU,9083
+pip/_internal/cli/req_command.py,sha256=oIwJ9DbEV4tCT8-bKOvKmGwdGeXXX4NkAgqmI70-NJc,16470
+pip/_internal/cli/spinners.py,sha256=JJKIdn76dBD6xGrBvXigjnzJtIHZeZfB5gzMOH_LHiw,5614
+pip/_internal/cli/status_codes.py,sha256=1xaB32lG8Nf1nMl_6e0yy5z2Iyvv81OTUpuHwXgGsfU,122
+pip/_internal/commands/__init__.py,sha256=FknHsVy_gYqpY6Y0zKae3kFyIqQHG32ptp09w8jzSoA,3866
+pip/_internal/commands/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/cache.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/check.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/completion.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/configuration.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/debug.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/download.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/freeze.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/hash.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/help.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/install.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/list.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/search.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/show.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/uninstall.cpython-39.pyc,,
+pip/_internal/commands/__pycache__/wheel.cpython-39.pyc,,
+pip/_internal/commands/cache.py,sha256=s-5zrlvMcgDCIT0pS7fsGm6gO51JZeU5nqRgTLS7r8g,7737
+pip/_internal/commands/check.py,sha256=3_wGQDSB_vNe6WLoN-oOwJJcSRov6vq5vP1wf0bUUfw,1728
+pip/_internal/commands/completion.py,sha256=9JoflBglg7tTYAHMU3RGQ9hpM632aWlp0UNXe7qADU4,3137
+pip/_internal/commands/configuration.py,sha256=YFrjdGREeyXerrhgWAgtEDfKxQ4Gwhl8GH9PyY1FkC4,9599
+pip/_internal/commands/debug.py,sha256=NiJHfTo3RpMubodWygd99vxgwZhA10JhfwKTRBccwt8,7150
+pip/_internal/commands/download.py,sha256=7QQ6MpSOBv0E1sXlD7m9OlxBD13PDsarJVrGYp3wKHk,5227
+pip/_internal/commands/freeze.py,sha256=mH5TWSL9eQ1dwcd87CapJvWLG0XRPcNzKeZiUyv326s,3628
+pip/_internal/commands/hash.py,sha256=6qN_SwOyWw93m44BBaq4nlRc2C7cJxFyXIMwRHZBc5c,1864
+pip/_internal/commands/help.py,sha256=bh7rxWxR3_ZUhk3QCqfkreSftcUY0ijPX1eeR0z6SfM,1282
+pip/_internal/commands/install.py,sha256=TNUBOFuF7oysQFG95k3aIE-5EIJNaYIShrVprDq6HyQ,28014
+pip/_internal/commands/list.py,sha256=o-vhJtkGZ5Y5X0UIa9FJFTa_aLkFCNof_U3ltzAksC4,11753
+pip/_internal/commands/search.py,sha256=F8Ab9LbsHLUGv79ZWa_vnCGN8DZ54iHX-yFRPU9gXak,6124
+pip/_internal/commands/show.py,sha256=iEAH0ehpOGM9rP9DpkYJNZqfd5a2yL82GF37tZ2S7Yc,7140
+pip/_internal/commands/uninstall.py,sha256=lOow-0Ja0CecV0kQjg7l019PAAa19P8M8alCUM4VoxQ,3364
+pip/_internal/commands/wheel.py,sha256=TVjbMZ7BMB1pRGeAZ8z3zGdk7ZnTL76x6iSg6GYI4Fc,6931
+pip/_internal/configuration.py,sha256=5ph7m7u6j3eNlVTJIFSLMaRPl2m8mesTmAEBLG0dCPY,14225
+pip/_internal/distributions/__init__.py,sha256=FFd96Mt1zxxzsFEzbR3yL1rDmQkDhWnnLhMR6LlzboU,983
+pip/_internal/distributions/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/distributions/__pycache__/base.cpython-39.pyc,,
+pip/_internal/distributions/__pycache__/installed.cpython-39.pyc,,
+pip/_internal/distributions/__pycache__/sdist.cpython-39.pyc,,
+pip/_internal/distributions/__pycache__/wheel.cpython-39.pyc,,
+pip/_internal/distributions/base.py,sha256=Gb1nPCK39r1jmOcSbqMr-Tp5rDpB9p1BfpTqZDt4XpU,1385
+pip/_internal/distributions/installed.py,sha256=_FosTYlkY8U7BrJbyJlTLmKpVhHzIYBeaxWhn2THEbM,786
+pip/_internal/distributions/sdist.py,sha256=mbNJcb6oMuQLr4wJWKZTNldZQSQmKZJuPQk7FWPJYbg,4182
+pip/_internal/distributions/wheel.py,sha256=fu3BFBDAmhgYu2ce12TsvpcCBfuMMFwIkj9nNy0gskQ,1332
+pip/_internal/exceptions.py,sha256=hakQGwr-evbRl1NmTPqzD04fk3Lc22rtCm8EtN_JNXA,13158
+pip/_internal/index/__init__.py,sha256=x0ifDyFChwwQC4V_eHBFF1fvzLwbXRYG3nq15-Axy24,32
+pip/_internal/index/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/index/__pycache__/collector.cpython-39.pyc,,
+pip/_internal/index/__pycache__/package_finder.cpython-39.pyc,,
+pip/_internal/index/collector.py,sha256=tkMhV8szxUKFewRB-WeJB2eZu5VciVuNWlq3jgDcwOI,22547
+pip/_internal/index/package_finder.py,sha256=5c1zFdAwfIZzEQFArPnoEbAYeLtNMGVJfKl8vydSOl4,37800
+pip/_internal/locations.py,sha256=VEtA-xzIiZifWmU8YrTGsovl9TPQfE0zrTRn_MGEcSA,6485
+pip/_internal/main.py,sha256=4U06fJfknPpyb5T_SBkohFNOAde0-qIcAD0EXsvACM4,453
+pip/_internal/models/__init__.py,sha256=j2kiRfNTH6h5JVP5yO_N2Yn0DqiNzJUtaPjHe2xMcgg,65
+pip/_internal/models/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/models/__pycache__/candidate.cpython-39.pyc,,
+pip/_internal/models/__pycache__/direct_url.cpython-39.pyc,,
+pip/_internal/models/__pycache__/format_control.cpython-39.pyc,,
+pip/_internal/models/__pycache__/index.cpython-39.pyc,,
+pip/_internal/models/__pycache__/link.cpython-39.pyc,,
+pip/_internal/models/__pycache__/scheme.cpython-39.pyc,,
+pip/_internal/models/__pycache__/search_scope.cpython-39.pyc,,
+pip/_internal/models/__pycache__/selection_prefs.cpython-39.pyc,,
+pip/_internal/models/__pycache__/target_python.cpython-39.pyc,,
+pip/_internal/models/__pycache__/wheel.cpython-39.pyc,,
+pip/_internal/models/candidate.py,sha256=WMx9i7VnMtcraKXDWBSxR_WOECCEiopWy8LUcntUrvw,1208
+pip/_internal/models/direct_url.py,sha256=GGLCpPIGVKKYOR-0Lp55cGc0lPMoPWsUumWChA4BiDY,6913
+pip/_internal/models/format_control.py,sha256=YUBHSFsmBxmGDJytEJhCOA6YWaUcRq6ufvWlleehqKI,2802
+pip/_internal/models/index.py,sha256=idYm7uI8E8NQ9Sy3MqhnnRphnDmN4iTj5UbY5f7Llcg,1126
+pip/_internal/models/link.py,sha256=ZnNyXIo5Pn2YC_63q1CFOdrS_GdEMOjyolXk6_69adE,7639
+pip/_internal/models/scheme.py,sha256=sZ18s2TzMgmHZzwm8PMDX-hy4wxTQrSKUoNcCYS2F34,801
+pip/_internal/models/search_scope.py,sha256=GLnSaJKgMsSuCqCatTmDypSiKZI6Ny1CgRCX2GPTFoA,4835
+pip/_internal/models/selection_prefs.py,sha256=gY2ynXkzHLKBL1UZ2d5Pl6Q4m-Mp6CPg-ufZesUxyEY,2087
+pip/_internal/models/target_python.py,sha256=x57cmp9eb9zHZXncctwAdL4H1Rx_J27U5gaYrCQ2u7M,4169
+pip/_internal/models/wheel.py,sha256=KFKEmqRIScyHEfqovopluujgd1uRXZMTsWte5Eh9sPY,2834
+pip/_internal/network/__init__.py,sha256=IEtuAPVGqBTS0C7M0KJ95xqGcA76coOc2AsDcgIBP-8,52
+pip/_internal/network/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/network/__pycache__/auth.cpython-39.pyc,,
+pip/_internal/network/__pycache__/cache.cpython-39.pyc,,
+pip/_internal/network/__pycache__/download.cpython-39.pyc,,
+pip/_internal/network/__pycache__/lazy_wheel.cpython-39.pyc,,
+pip/_internal/network/__pycache__/session.cpython-39.pyc,,
+pip/_internal/network/__pycache__/utils.cpython-39.pyc,,
+pip/_internal/network/__pycache__/xmlrpc.cpython-39.pyc,,
+pip/_internal/network/auth.py,sha256=svvGxUsGxBmqNmX8T4kPi9PvZpe1eCBTZJk8rFkzkcw,11895
+pip/_internal/network/cache.py,sha256=zUu27h3fS78yIeSNuEB9sq_2GjFJ6S2RpRLCwQSFy64,2378
+pip/_internal/network/download.py,sha256=UONxxUxEwgNEfFN_lappdU_6NBa4RXqZ00LeHBOuL20,6587
+pip/_internal/network/lazy_wheel.py,sha256=Qgl8wi3_188W4nOjKi-L_J3OUqUKq0Ge0dDsxTSCFaY,8293
+pip/_internal/network/session.py,sha256=5N0_AdbQ_RnFLd2fg9ZMSuz22iaBSnM27LIoNotAtgQ,15643
+pip/_internal/network/utils.py,sha256=BpdHEUSN7ONrdradjr8BEe2mL5rlj8xFFtmYbMzhHz0,4266
+pip/_internal/network/xmlrpc.py,sha256=GgX5TAU3s50jlAmdwaUG7m_iV06rbPtaI5_NGd53AVo,1871
+pip/_internal/operations/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+pip/_internal/operations/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/operations/__pycache__/check.cpython-39.pyc,,
+pip/_internal/operations/__pycache__/freeze.cpython-39.pyc,,
+pip/_internal/operations/__pycache__/prepare.cpython-39.pyc,,
+pip/_internal/operations/build/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+pip/_internal/operations/build/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/operations/build/__pycache__/metadata.cpython-39.pyc,,
+pip/_internal/operations/build/__pycache__/metadata_legacy.cpython-39.pyc,,
+pip/_internal/operations/build/__pycache__/wheel.cpython-39.pyc,,
+pip/_internal/operations/build/__pycache__/wheel_legacy.cpython-39.pyc,,
+pip/_internal/operations/build/metadata.py,sha256=R1ir---xEH_Nx93KgH-tWYbkUYe64aQFLEO_b2NddEE,1293
+pip/_internal/operations/build/metadata_legacy.py,sha256=oqD7-jGQSHUxP90E2muEbwbjz5Bh1N1nUWWKxWOpwjI,2080
+pip/_internal/operations/build/wheel.py,sha256=9O1d7GiejX4V0d6jKNXc9TvuY0_CYg5a9L41UpFrB_o,1505
+pip/_internal/operations/build/wheel_legacy.py,sha256=hJ4tiC_fFzyIM2uoAFfd4Fk-CBD1AeJJyw_XZOFBlN4,3426
+pip/_internal/operations/check.py,sha256=WFd1l5wlpyOL7u8dZsSFDi9lzoZ-lRA7L1VLYiYH2kY,5371
+pip/_internal/operations/freeze.py,sha256=JNF8zG9BuqJ2fWJgQZV0H7eaiKdMNNLx5xc8gvlA9k0,10442
+pip/_internal/operations/install/__init__.py,sha256=Zug2xxRJjeI2LdVd45iwmeavUBYXA4ltbhFFwc4BEOg,53
+pip/_internal/operations/install/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/operations/install/__pycache__/editable_legacy.cpython-39.pyc,,
+pip/_internal/operations/install/__pycache__/legacy.cpython-39.pyc,,
+pip/_internal/operations/install/__pycache__/wheel.cpython-39.pyc,,
+pip/_internal/operations/install/editable_legacy.py,sha256=gEMhkmuxJ0wfqKspmhu_i41Ltq1_F-4IrTYPxsWVxEw,1540
+pip/_internal/operations/install/legacy.py,sha256=8aby3xQI90LPZOx7uDWvBV2cehOpGkj4Lb4js7bCrZA,4403
+pip/_internal/operations/install/wheel.py,sha256=dCig7D4jT4PMjtyLuhuMwDfD8kzXgN9ra8B9uyeMUYg,31158
+pip/_internal/operations/prepare.py,sha256=t5rXOX388PznqycvpLNGYRLpCcqsniKtmAzY9FSvXQg,22142
+pip/_internal/pyproject.py,sha256=yxwQz1Ib3r2dX6mp6qecZwiDThLK13eT2dkWW67swkA,7333
+pip/_internal/req/__init__.py,sha256=BUtbA3pab69f5WjMnLnkoXTEKSo8ZxEFsaYyf0Ql8m0,3170
+pip/_internal/req/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/req/__pycache__/constructors.cpython-39.pyc,,
+pip/_internal/req/__pycache__/req_file.cpython-39.pyc,,
+pip/_internal/req/__pycache__/req_install.cpython-39.pyc,,
+pip/_internal/req/__pycache__/req_set.cpython-39.pyc,,
+pip/_internal/req/__pycache__/req_tracker.cpython-39.pyc,,
+pip/_internal/req/__pycache__/req_uninstall.cpython-39.pyc,,
+pip/_internal/req/constructors.py,sha256=-AhfcM1mO5Ng105XMcyA2k2ylp_rK513b4dPJZUbmRs,16035
+pip/_internal/req/req_file.py,sha256=Uaga43m7o24fKKwZEJmgFMmdSd-t-mV5Zvp8VwzDpiw,18552
+pip/_internal/req/req_install.py,sha256=PHz0mt8zWwRBxn8kcWPmt_eu1i3HHwUMjmFaQ-hl3TI,33145
+pip/_internal/req/req_set.py,sha256=CcW_X-n0cB81rmEixPOBmalXg7hOI4aOHMf19s1TVpA,8013
+pip/_internal/req/req_tracker.py,sha256=DcFBkSFp2ysr5x90NExMHlwus15mclFDj4_vNJrYcTk,4656
+pip/_internal/req/req_uninstall.py,sha256=TAfa5mYhhIYpnIITDz6hUy1z00ME_mybAhx0eekhp1s,24255
+pip/_internal/resolution/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+pip/_internal/resolution/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/resolution/__pycache__/base.cpython-39.pyc,,
+pip/_internal/resolution/base.py,sha256=df4S86qAB5UMkgzJ3ZNgxtkHusg9UywRF2J8ef91qPw,696
+pip/_internal/resolution/legacy/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+pip/_internal/resolution/legacy/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/resolution/legacy/__pycache__/resolver.cpython-39.pyc,,
+pip/_internal/resolution/legacy/resolver.py,sha256=mXfkbMGIUMYmdHkH7esyyLYKpuhIsvhKgPD_P884QXI,18692
+pip/_internal/resolution/resolvelib/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+pip/_internal/resolution/resolvelib/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/resolution/resolvelib/__pycache__/base.cpython-39.pyc,,
+pip/_internal/resolution/resolvelib/__pycache__/candidates.cpython-39.pyc,,
+pip/_internal/resolution/resolvelib/__pycache__/factory.cpython-39.pyc,,
+pip/_internal/resolution/resolvelib/__pycache__/found_candidates.cpython-39.pyc,,
+pip/_internal/resolution/resolvelib/__pycache__/provider.cpython-39.pyc,,
+pip/_internal/resolution/resolvelib/__pycache__/reporter.cpython-39.pyc,,
+pip/_internal/resolution/resolvelib/__pycache__/requirements.cpython-39.pyc,,
+pip/_internal/resolution/resolvelib/__pycache__/resolver.cpython-39.pyc,,
+pip/_internal/resolution/resolvelib/base.py,sha256=pCjC_evBGsbYZhseXluXMCKkyAS-LNkqRqSj-AHkXdw,5193
+pip/_internal/resolution/resolvelib/candidates.py,sha256=hOC1JSpzE0Yt-4J9l7G7tUIBSIyWBUVF6JaowJJ1a_8,20211
+pip/_internal/resolution/resolvelib/factory.py,sha256=avshtHNs5nrlExEbDcTLBMnlkNGXwQBUkSHX0Kk9Yxc,19228
+pip/_internal/resolution/resolvelib/found_candidates.py,sha256=P5C46Bj91edImB8tUNFkC5Zr75HhGgwrY6cjRbMf3fs,5479
+pip/_internal/resolution/resolvelib/provider.py,sha256=7_LjNcubdcjMEnehHiaN3iewSPu4m4coS2mTT7PQzp8,7513
+pip/_internal/resolution/resolvelib/reporter.py,sha256=-bdsWDecNezUPSWbSHEX-xNyVP8Tkd0zO96bhxFr1Vg,2941
+pip/_internal/resolution/resolvelib/requirements.py,sha256=mRJLY0Rc_muoYqZ7XDEYYmJN7Aw4L3sq4ywKimy_UZE,6162
+pip/_internal/resolution/resolvelib/resolver.py,sha256=MgZmX_N5J5xqwc-f0tvNyjnFCfWbrgtUlWmCHOh677g,11912
+pip/_internal/self_outdated_check.py,sha256=JsHoDtt7VjsGO7ADfj-SvUlU-Ul_B_RI9pTA0ma1ZLI,6858
+pip/_internal/utils/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+pip/_internal/utils/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/appdirs.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/compat.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/compatibility_tags.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/datetime.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/deprecation.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/direct_url_helpers.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/distutils_args.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/encoding.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/entrypoints.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/filesystem.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/filetypes.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/glibc.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/hashes.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/inject_securetransport.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/logging.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/misc.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/models.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/packaging.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/parallel.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/pkg_resources.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/setuptools_build.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/subprocess.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/temp_dir.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/typing.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/unpacking.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/urls.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/virtualenv.cpython-39.pyc,,
+pip/_internal/utils/__pycache__/wheel.cpython-39.pyc,,
+pip/_internal/utils/appdirs.py,sha256=AyrjuIvYUfsCa7CncTTUOg1BxlERcwI42pSWZmsnNNM,1351
+pip/_internal/utils/compat.py,sha256=IQCUn_HI5WEB6UJrwQ19nmTKmrqTF_eQU7And3MbikM,5274
+pip/_internal/utils/compatibility_tags.py,sha256=Ef2fJYjWAqnFPMeR0FOlxTh47lu0SBrenQm0kxf-J2g,5816
+pip/_internal/utils/datetime.py,sha256=OLKI87v18hQDVOyRMuyPX15nbCUWO4rMc9bAm2KpkGc,267
+pip/_internal/utils/deprecation.py,sha256=jIp8XEEK76VrUV9LgEA_i9nhriAYWUvU5Oly1L9YyKw,3380
+pip/_internal/utils/direct_url_helpers.py,sha256=vnN-_foyl_4ooLdhmrudTxfWLaIBgstyxPENL2mS2H4,4152
+pip/_internal/utils/distutils_args.py,sha256=conKwlSsvwBEnIISbWnqEwZwch-kyq3hQ0nKOBWPIm4,1398
+pip/_internal/utils/encoding.py,sha256=QAG90ZRmRCTA80YHRwjdvLQVFCRmhkxebvhgDBawWyY,1317
+pip/_internal/utils/entrypoints.py,sha256=CwZKpoIbCxxCTV91Nkz6ReFZQnrSGBbruipbiGoaCLY,1183
+pip/_internal/utils/filesystem.py,sha256=OaK_UFxXDjXOvAPuNE-bb8eeuwbdZcHnVCN18yAK-98,6389
+pip/_internal/utils/filetypes.py,sha256=tvz0Rml52zC_GZfgQxhdR10vP0qAxED-3cDyPrdFajM,873
+pip/_internal/utils/glibc.py,sha256=VZsrQxHLTFxZWWKcEDi7x6jIE-qwF8reZC6_m8MyhMw,3353
+pip/_internal/utils/hashes.py,sha256=DB6Pd5fCdWq-lMRfEE2mMCTkjmPfMTfgmqMzMmk50WE,5203
+pip/_internal/utils/inject_securetransport.py,sha256=-ATb-4xYRxDqnAoJY5z6sflj1O0XF81Hk2tEroxrSo8,846
+pip/_internal/utils/logging.py,sha256=IZ2eLA6UyI9n3cK5nJED7Z4PBb4dv4Y5CopQHxUjwj4,11448
+pip/_internal/utils/misc.py,sha256=yxEkMv1xYRa_GLbb6qDuLWx5jwYksAnlrugbpx-30p0,27604
+pip/_internal/utils/models.py,sha256=CSUxzZEuh9va4KE5tUpOtjVW7a8Pd003XVI4eNMgXSE,1151
+pip/_internal/utils/packaging.py,sha256=UEZLCFmt-9BiRBfmLwWlt8D2nGkpLIReRq7gzGYLLMI,3089
+pip/_internal/utils/parallel.py,sha256=Hb_H-6sptwWCN4VGzEwbs8_rxh6B6q8FBkgIBM2Dz-M,3432
+pip/_internal/utils/pkg_resources.py,sha256=LD_Y6KCr0MEBbt4KDWe1r077l9qpeUUd61o18TEhE_Q,1282
+pip/_internal/utils/setuptools_build.py,sha256=OQO4vmlFjXWPFTdQwTnMzpUOi9OPtYdq6tbZ1rL8YCI,5239
+pip/_internal/utils/subprocess.py,sha256=UX9CHNORjhqVcedNG1Jjwsl0Kw40v63Dso7RmpwmvFc,11078
+pip/_internal/utils/temp_dir.py,sha256=G2qWEirX_8ablfvxKzkHFIh2jCBF4OfOCrzgBDFUuIY,8983
+pip/_internal/utils/typing.py,sha256=6T7qX9SYEJMUwgn2ZqdhM-SSmDwWTIzRHkeL49Q10-I,1439
+pip/_internal/utils/unpacking.py,sha256=4AvWThNDIpEA0GdO9GUSSN2VLnp9HPDOxTMhcm8_KO8,9464
+pip/_internal/utils/urls.py,sha256=Z3ClgEtfZIdrU-YLRA6uVVMvejy-EwdxzrnqjZj0eu8,1452
+pip/_internal/utils/virtualenv.py,sha256=Hm8fXwb_xWBt-HxD-0wEasli_BA6eB3RWVVkyzwS37s,3769
+pip/_internal/utils/wheel.py,sha256=WeowquCm4hd7REee-RX0Q-t6Yq11db9ze6bY4OFaWLs,7304
+pip/_internal/vcs/__init__.py,sha256=Ovj2REzS3fFosLAKw5lnd3CX76J2nN9b1FNY6KluBgE,632
+pip/_internal/vcs/__pycache__/__init__.cpython-39.pyc,,
+pip/_internal/vcs/__pycache__/bazaar.cpython-39.pyc,,
+pip/_internal/vcs/__pycache__/git.cpython-39.pyc,,
+pip/_internal/vcs/__pycache__/mercurial.cpython-39.pyc,,
+pip/_internal/vcs/__pycache__/subversion.cpython-39.pyc,,
+pip/_internal/vcs/__pycache__/versioncontrol.cpython-39.pyc,,
+pip/_internal/vcs/bazaar.py,sha256=9QxXZaQY4rCnfJ4rbHHqDQTdb248-fy3cS_wuF_A5YQ,3786
+pip/_internal/vcs/git.py,sha256=jwXPCDxSUq03H23c0dVg9x3nr8jAtEoZytFl_UTFFoI,15935
+pip/_internal/vcs/mercurial.py,sha256=_aONuC99EOc_UOP4rOp29dzcea7oefVu5fm6LY4y9IE,5705
+pip/_internal/vcs/subversion.py,sha256=cZpJVzGN11yxVLAYcm9PcrZ65C2gChDACL1cd952snk,13019
+pip/_internal/vcs/versioncontrol.py,sha256=WMBOrxHPzV3SE8tL8-slNM6d9M4HlOMd9xLehZZ6wmM,23848
+pip/_internal/wheel_builder.py,sha256=_ZomhgGv70RyqYe0FPyYkPFuQEwl3jJ-jOCJWBfaCSk,12202
+pip/_vendor/__init__.py,sha256=xsgffPuXJIsmc6cAP0jW-u7WUZ8TMF35kfixn9lmPMk,4902
+pip/_vendor/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/__pycache__/appdirs.cpython-39.pyc,,
+pip/_vendor/__pycache__/contextlib2.cpython-39.pyc,,
+pip/_vendor/__pycache__/distro.cpython-39.pyc,,
+pip/_vendor/__pycache__/pyparsing.cpython-39.pyc,,
+pip/_vendor/__pycache__/retrying.cpython-39.pyc,,
+pip/_vendor/__pycache__/six.cpython-39.pyc,,
+pip/_vendor/appdirs.py,sha256=Od1rs7d0yMmHLUc0FQn2DleIUbC--EEmM-UtXvFqAjM,26540
+pip/_vendor/cachecontrol/__init__.py,sha256=SR74BEsga7Z2I6-CH8doh2Oq_vH0GG7RCwjJg7TntdI,313
+pip/_vendor/cachecontrol/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/cachecontrol/__pycache__/_cmd.cpython-39.pyc,,
+pip/_vendor/cachecontrol/__pycache__/adapter.cpython-39.pyc,,
+pip/_vendor/cachecontrol/__pycache__/cache.cpython-39.pyc,,
+pip/_vendor/cachecontrol/__pycache__/compat.cpython-39.pyc,,
+pip/_vendor/cachecontrol/__pycache__/controller.cpython-39.pyc,,
+pip/_vendor/cachecontrol/__pycache__/filewrapper.cpython-39.pyc,,
+pip/_vendor/cachecontrol/__pycache__/heuristics.cpython-39.pyc,,
+pip/_vendor/cachecontrol/__pycache__/serialize.cpython-39.pyc,,
+pip/_vendor/cachecontrol/__pycache__/wrapper.cpython-39.pyc,,
+pip/_vendor/cachecontrol/_cmd.py,sha256=KIO6PIJoXmNr5RGS2pZjDum1-40oR4fw5kE0LguxrY4,1352
+pip/_vendor/cachecontrol/adapter.py,sha256=FBRrYfpkXaH8hKogEgw6wYCScnL2SJFDZlHBNF0EvLE,5015
+pip/_vendor/cachecontrol/cache.py,sha256=gCo5R0D__iptJ49dUfxwWfu2Lc2OjpDs-MERy2hTpK8,844
+pip/_vendor/cachecontrol/caches/__init__.py,sha256=rN8Ox5dd2ucPtgkybgz77XfTTUL4HFTO2-n2ACK2q3E,88
+pip/_vendor/cachecontrol/caches/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/cachecontrol/caches/__pycache__/file_cache.cpython-39.pyc,,
+pip/_vendor/cachecontrol/caches/__pycache__/redis_cache.cpython-39.pyc,,
+pip/_vendor/cachecontrol/caches/file_cache.py,sha256=tw35e4ZnOsxqrlZ2fQ2VYz2FlUlCbFMerNu2tPwtRHY,4299
+pip/_vendor/cachecontrol/caches/redis_cache.py,sha256=hFJ_J9MCUTjblJCBT_cV_glP--2toqHDCKLRGUIHSOQ,889
+pip/_vendor/cachecontrol/compat.py,sha256=3BisP29GBHAo0QxUrbpBsMeXSp8YzKQcGHwEW7VYU2U,724
+pip/_vendor/cachecontrol/controller.py,sha256=fTDK1V7NjpnU1hwfMboX4Vyh73-uWgL6QkghtvvyTrY,14525
+pip/_vendor/cachecontrol/filewrapper.py,sha256=YsK9ISeZg26n-rS0z7MdEcMTyQ9gW_fLb6zIRJvE2rg,2613
+pip/_vendor/cachecontrol/heuristics.py,sha256=yndlfXHJZ5u_TC1ECrV4fVl68OuWiXnDS0HPyscK1MM,4205
+pip/_vendor/cachecontrol/serialize.py,sha256=7Jq5PcVBH6RVI-qkKkQsV5yAiZCFQa7yFhvITw_DYsc,7279
+pip/_vendor/cachecontrol/wrapper.py,sha256=tKJnzRvbl7uJRxOChwlNLdJf9NR0QlnknQxgNzQW2kM,719
+pip/_vendor/certifi/__init__.py,sha256=yNK-O9MHyQX1qYVnBuiU97REsFFEMimhp3MnaIh9Kbc,65
+pip/_vendor/certifi/__main__.py,sha256=4JJNpOgznsXzgISGReUBrJGB6Q4zJOlIV99WFE185fM,267
+pip/_vendor/certifi/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/certifi/__pycache__/__main__.cpython-39.pyc,,
+pip/_vendor/certifi/__pycache__/core.cpython-39.pyc,,
+pip/_vendor/certifi/cacert.pem,sha256=u3fxPT--yemLvyislQRrRBlsfY9Vq3cgBh6ZmRqCkZc,263774
+pip/_vendor/certifi/core.py,sha256=WCYiIkg5ozbypABAcRagDOa9DCO2qgnf66GZ1SRgmWA,2375
+pip/_vendor/chardet/__init__.py,sha256=yxky3TQpsr5YTFEf5XYv0O4wq2e1WSilELYZ9e2AEes,3354
+pip/_vendor/chardet/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/big5freq.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/big5prober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/chardistribution.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/charsetgroupprober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/charsetprober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/codingstatemachine.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/compat.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/cp949prober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/enums.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/escprober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/escsm.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/eucjpprober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/euckrfreq.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/euckrprober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/euctwfreq.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/euctwprober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/gb2312freq.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/gb2312prober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/hebrewprober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/jisfreq.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/jpcntx.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/langbulgarianmodel.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/langgreekmodel.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/langhebrewmodel.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/langhungarianmodel.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/langrussianmodel.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/langthaimodel.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/langturkishmodel.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/latin1prober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/mbcharsetprober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/mbcsgroupprober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/mbcssm.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/sbcharsetprober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/sbcsgroupprober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/sjisprober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/universaldetector.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/utf8prober.cpython-39.pyc,,
+pip/_vendor/chardet/__pycache__/version.cpython-39.pyc,,
+pip/_vendor/chardet/big5freq.py,sha256=dwRzRlsGp3Zgr1JQSSSwnxvyaZ7_q-5kuPfCVMuy4to,31640
+pip/_vendor/chardet/big5prober.py,sha256=TpmdoNfRtnQ7x9Q_p-a1CHaG-ok2mbisN5e9UHAtOiY,1804
+pip/_vendor/chardet/chardistribution.py,sha256=NzboAhfS6GODy_Tp6BkmUOL4NuxwTVfdVFcKA9bdUAo,9644
+pip/_vendor/chardet/charsetgroupprober.py,sha256=NPYh0Agp8UnrfqIls_qdbwszQ1mv9imGawGUCErFT6M,3946
+pip/_vendor/chardet/charsetprober.py,sha256=kk5-m0VdjqzbEhPRkBZ386R3fBQo3DxsBrdL-WFyk1o,5255
+pip/_vendor/chardet/cli/__init__.py,sha256=frcCV1k9oG9oKj3dpUqdJg1PxRT2RSN_XKdLCPjaYaY,2
+pip/_vendor/chardet/cli/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/chardet/cli/__pycache__/chardetect.cpython-39.pyc,,
+pip/_vendor/chardet/cli/chardetect.py,sha256=535zsG4tA_x-_xPtEeDvn46QLib2nvF-5NT_nJdGgVs,2831
+pip/_vendor/chardet/codingstatemachine.py,sha256=qz9ZwK1q4mZ4s4zDRbyXu5KaGunYbk7g1Z7fqfb4mA4,3678
+pip/_vendor/chardet/compat.py,sha256=3j2eGvEAakISaIanHZ4wZutzfttNdRSdlo6RSjpyxsM,1236
+pip/_vendor/chardet/cp949prober.py,sha256=5NnMVUcel3jDY3w8ljD0cXyj2lcrvdagxOVE1jxl7xc,1904
+pip/_vendor/chardet/enums.py,sha256=3H_EIVP-VUYOdKqe2xmYdyooEDSLqS8sACMbn_3oejU,1737
+pip/_vendor/chardet/escprober.py,sha256=5MrTnVtZGEt3ssnY-lOXmOo3JY-CIqz9ruG3KjDpkbY,4051
+pip/_vendor/chardet/escsm.py,sha256=xQbwmM3Ieuskg-Aohyc6-bSfg3vsY0tx2TEKLDoVZGg,10756
+pip/_vendor/chardet/eucjpprober.py,sha256=PHumemJS19xMhDR4xPrsvxMfyBfsb297kVWmYz6zgy8,3841
+pip/_vendor/chardet/euckrfreq.py,sha256=MrLrIWMtlaDI0LYt-MM3MougBbLtSWHs6kvZx0VasIM,13741
+pip/_vendor/chardet/euckrprober.py,sha256=VbiOn7_id7mL9Q5GdeV0Ze3w5fG0nRCpUkEzeR-bnnY,1795
+pip/_vendor/chardet/euctwfreq.py,sha256=ZPBIHZDwNknGf7m6r4xGH8bX0W38qBpnTwVVv1QHw_M,32008
+pip/_vendor/chardet/euctwprober.py,sha256=hlUyGKUxzOPfBxCcyUcvRZSxgkLuvRoDU9wejp6YMiM,1793
+pip/_vendor/chardet/gb2312freq.py,sha256=aLHs-2GS8vmSM2ljyoWWgeVq_xRRcS_gN7ykpIiV43A,20998
+pip/_vendor/chardet/gb2312prober.py,sha256=msVbrDFcrJRE_XvsyETiqbTGfvdFhVIEZ2zBd-OENaE,1800
+pip/_vendor/chardet/hebrewprober.py,sha256=r81LqgKb24ZbvOmfi95MzItUxx7bkrjJR1ppkj5rvZw,14130
+pip/_vendor/chardet/jisfreq.py,sha256=vrqCR4CmwownBVXJ3Hh_gsfiDnIHOELbcNmTyC6Jx3w,26102
+pip/_vendor/chardet/jpcntx.py,sha256=Cn4cypo2y8CpqCan-zsdfYdEgXkRCnsqQoYaCu6FRjI,19876
+pip/_vendor/chardet/langbulgarianmodel.py,sha256=IuDOQ4uAe5spaYXt1F-2_496DFYd3J5lyLKKbVg-Nkw,110347
+pip/_vendor/chardet/langgreekmodel.py,sha256=cZRowhYjEUNYCevhuD5ZMHMiOIf3Pk1IpRixjTpRPB0,103969
+pip/_vendor/chardet/langhebrewmodel.py,sha256=p-xw_b2XvGVSIQFgQL91cVpS7u3vPpGJZ0udYxD07Do,103159
+pip/_vendor/chardet/langhungarianmodel.py,sha256=EKIZs5Z8Y-l6ORDcBzE9htOMMnAnr2j6Wb1PFRBMVxM,107148
+pip/_vendor/chardet/langrussianmodel.py,sha256=TFH-3rTFzbCBF15oasmoqf92FKBnwWY_HaN2ptl5WVo,136898
+pip/_vendor/chardet/langthaimodel.py,sha256=rTzLQ2x_RjQEzZfIksCR--SCFQyuP5eCtQpqxyl5-x8,107695
+pip/_vendor/chardet/langturkishmodel.py,sha256=fWI_tafe_UQ24gdOGqOWy1tnEY2jxKHoi4ueoT3rrrc,100329
+pip/_vendor/chardet/latin1prober.py,sha256=s1SFkEFY2NGe2_9bgX2MhOmyM_U_qSd_jVSdkdSgZxs,5515
+pip/_vendor/chardet/mbcharsetprober.py,sha256=hzFVD-brxTAVLnTAkDqa1ztd6RwGGwb5oAdvhj1-lE8,3504
+pip/_vendor/chardet/mbcsgroupprober.py,sha256=DlT-X7KRUl5y3SWJNqF1NXqvkjVc47jPKjJ2j4KVs3A,2066
+pip/_vendor/chardet/mbcssm.py,sha256=LGUDh1VB61rWsZB4QlJBzaCjI2PUEUgbBc91gPlX4DQ,26053
+pip/_vendor/chardet/metadata/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+pip/_vendor/chardet/metadata/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/chardet/metadata/__pycache__/languages.cpython-39.pyc,,
+pip/_vendor/chardet/metadata/languages.py,sha256=pGf_EnapgynSUCViRjUcwEi7AWw_bYPJFHCqerAFSbQ,19784
+pip/_vendor/chardet/sbcharsetprober.py,sha256=VPAZ5z-o8ixIIfEGTScLVXeQxkd3Zqi1eceerr0rb78,6281
+pip/_vendor/chardet/sbcsgroupprober.py,sha256=p8XICsXYXOF78Anypfvdne8K_0p8qFC-SUF5nwD1fo4,4392
+pip/_vendor/chardet/sjisprober.py,sha256=1WGev_SSHpa7AVXmM0DIONl1OvyKc8mdydUNaKtGGNI,3866
+pip/_vendor/chardet/universaldetector.py,sha256=C3ryFrDZ9JuroNMdYwgDa2_zAYJlWuPHyHLX5WtCY-g,12789
+pip/_vendor/chardet/utf8prober.py,sha256=rGwn69WfIvmibp0sWvYuH_TPoXs7zzwKHTX79Ojbr9o,2848
+pip/_vendor/chardet/version.py,sha256=LCY3oiBIflXJGeBYm7ly2aw6P9n272rhp3t7qz3oOHo,251
+pip/_vendor/colorama/__init__.py,sha256=besK61Glmusp-wZ1wjjSlsPKEY_6zndaeulh1FkVStw,245
+pip/_vendor/colorama/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/colorama/__pycache__/ansi.cpython-39.pyc,,
+pip/_vendor/colorama/__pycache__/ansitowin32.cpython-39.pyc,,
+pip/_vendor/colorama/__pycache__/initialise.cpython-39.pyc,,
+pip/_vendor/colorama/__pycache__/win32.cpython-39.pyc,,
+pip/_vendor/colorama/__pycache__/winterm.cpython-39.pyc,,
+pip/_vendor/colorama/ansi.py,sha256=121ZIWJSdXR76TcqKXusVZQRgyb0AIlRnf5EW6oSGlQ,2624
+pip/_vendor/colorama/ansitowin32.py,sha256=bZByVMjpiUp-LSAK21KNvCh63UN9CPkXdHFPUsq20kA,10775
+pip/_vendor/colorama/initialise.py,sha256=J92wwYPAAEgdlAyw-ady4JJxl1j9UmXPodi0HicWDwg,1995
+pip/_vendor/colorama/win32.py,sha256=fI0Ani_DO_cYDAbHz_a0BsMbDKHCA1-P3PGcj0eDCmA,5556
+pip/_vendor/colorama/winterm.py,sha256=Zurpa5AEwarU62JTuERX53gGelEWH5SBUiAXN4CxMtA,6607
+pip/_vendor/contextlib2.py,sha256=t6Fla8KtAzH4ERLcdAfXizvnpp4nOw9GCq4GYFwTHkg,17433
+pip/_vendor/distlib/__init__.py,sha256=VmyMfsxv7AYUwPUA52UN_a1GzhtKpSpF7zM7y0G6ocA,604
+pip/_vendor/distlib/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/distlib/__pycache__/compat.cpython-39.pyc,,
+pip/_vendor/distlib/__pycache__/database.cpython-39.pyc,,
+pip/_vendor/distlib/__pycache__/index.cpython-39.pyc,,
+pip/_vendor/distlib/__pycache__/locators.cpython-39.pyc,,
+pip/_vendor/distlib/__pycache__/manifest.cpython-39.pyc,,
+pip/_vendor/distlib/__pycache__/markers.cpython-39.pyc,,
+pip/_vendor/distlib/__pycache__/metadata.cpython-39.pyc,,
+pip/_vendor/distlib/__pycache__/resources.cpython-39.pyc,,
+pip/_vendor/distlib/__pycache__/scripts.cpython-39.pyc,,
+pip/_vendor/distlib/__pycache__/util.cpython-39.pyc,,
+pip/_vendor/distlib/__pycache__/version.cpython-39.pyc,,
+pip/_vendor/distlib/__pycache__/wheel.cpython-39.pyc,,
+pip/_vendor/distlib/_backport/__init__.py,sha256=XkACqtjaFfFn1QQBFDNxSqhMva0LqXeeh6H3fVwwLQ4,280
+pip/_vendor/distlib/_backport/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/distlib/_backport/__pycache__/misc.cpython-39.pyc,,
+pip/_vendor/distlib/_backport/__pycache__/shutil.cpython-39.pyc,,
+pip/_vendor/distlib/_backport/__pycache__/sysconfig.cpython-39.pyc,,
+pip/_vendor/distlib/_backport/__pycache__/tarfile.cpython-39.pyc,,
+pip/_vendor/distlib/_backport/misc.py,sha256=focjmI7975W3LgEtiNC99lvxohfZdsNSLTakOcPNShs,1012
+pip/_vendor/distlib/_backport/shutil.py,sha256=h-yIttFtLq-_LKn5lLn4beHXzRwcmo2wEg4UKU7hX6E,26471
+pip/_vendor/distlib/_backport/sysconfig.cfg,sha256=LoipPkR2PfCKC7JUQBGxp6OFVlWIiWBXT-rNuzv8acU,2701
+pip/_vendor/distlib/_backport/sysconfig.py,sha256=qV5ZK6YVkHS-gUFacIT2TpFBw7bZJFH3DYa8PbT6O54,27640
+pip/_vendor/distlib/_backport/tarfile.py,sha256=fzwGLsCdTmO8uzoHjyjSgu4-srrDQEAcw4jNKUfvQH0,95235
+pip/_vendor/distlib/compat.py,sha256=Z8PBQ-ZPCJuRvzs5rtHuzceFOB8iYV8HHjAGrW3SQ8s,42528
+pip/_vendor/distlib/database.py,sha256=m_LtL3siDUdcSvftoTnXcjhUJA-WZhDwTvHO7rg72SA,52398
+pip/_vendor/distlib/index.py,sha256=MYT9QkE79nX-D9fz1tBpl6YHHmq4uSO95Sp-Gq6dN7E,21582
+pip/_vendor/distlib/locators.py,sha256=DMRfq00jgdPDwelULclHE8qbjNVqGCBoTOXl2kfiwMY,53402
+pip/_vendor/distlib/manifest.py,sha256=0TlGw5ZyFp8wxr_GJz7tAAXGYwUJvceMIOsh9ydAXpM,15204
+pip/_vendor/distlib/markers.py,sha256=k4Fx6LHfaIaX1eOIoaWK_-o-zE8zoT5rXwb6mbnLoXk,4518
+pip/_vendor/distlib/metadata.py,sha256=E3b0ee3kUoNbawem10Mc6qGCBNCUxFvS4TkYKUX8z2Q,40018
+pip/_vendor/distlib/resources.py,sha256=5Xn4ehSMQKsu6kf4gxIsMvy668RRvtL0XwUPytyviPE,11121
+pip/_vendor/distlib/scripts.py,sha256=oGaqPfOX_wcLXbzW2xf8ojJQbU9aJ29QiUgslWNHncM,17599
+pip/_vendor/distlib/t32.exe,sha256=NS3xBCVAld35JVFNmb-1QRyVtThukMrwZVeXn4LhaEQ,96768
+pip/_vendor/distlib/t64.exe,sha256=oAqHes78rUWVM0OtVqIhUvequl_PKhAhXYQWnUf7zR0,105984
+pip/_vendor/distlib/util.py,sha256=vjN27blgrxQkPPiBbAhEbdiv_Xw0ogu4XAT9SgU3x-c,61606
+pip/_vendor/distlib/version.py,sha256=tFjbWEAxyeCDw0dWQDJsWsu9EzegUI5Yhm3IBu2x8hY,24127
+pip/_vendor/distlib/w32.exe,sha256=lJtnZdeUxTZWya_EW5DZos_K5rswRECGspIl8ZJCIXs,90112
+pip/_vendor/distlib/w64.exe,sha256=0aRzoN2BO9NWW4ENy4_4vHkHR4qZTFZNVSAJJYlODTI,99840
+pip/_vendor/distlib/wheel.py,sha256=u8_DwGV_j2-fxQRizS3V9OTioXV-IZ6EC-n6yOnjUfc,42162
+pip/_vendor/distro.py,sha256=ni3ahks9qSr3P1FMur9zTPEF_xcAdaHW8iWZWqwB5mU,44858
+pip/_vendor/html5lib/__init__.py,sha256=Bmlpvs5dN2GoaWRAvN2UZ1yF_p7xb2zROelA0QxBKis,1195
+pip/_vendor/html5lib/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/html5lib/__pycache__/_ihatexml.cpython-39.pyc,,
+pip/_vendor/html5lib/__pycache__/_inputstream.cpython-39.pyc,,
+pip/_vendor/html5lib/__pycache__/_tokenizer.cpython-39.pyc,,
+pip/_vendor/html5lib/__pycache__/_utils.cpython-39.pyc,,
+pip/_vendor/html5lib/__pycache__/constants.cpython-39.pyc,,
+pip/_vendor/html5lib/__pycache__/html5parser.cpython-39.pyc,,
+pip/_vendor/html5lib/__pycache__/serializer.cpython-39.pyc,,
+pip/_vendor/html5lib/_ihatexml.py,sha256=IyMKE35pNPCYYGs290_oSUhWXF1BQZsbVcXBzGuFvl4,17017
+pip/_vendor/html5lib/_inputstream.py,sha256=EA6Wj46jxuK6544Vnk9YOjIpFwGbfJW0Ar2cMH1H0VU,33271
+pip/_vendor/html5lib/_tokenizer.py,sha256=BUDNWZENVB0oFBiKR49sZsqQU4rzLLa13-byISlYRfA,78775
+pip/_vendor/html5lib/_trie/__init__.py,sha256=kfSo27BaU64El8U7bg4ugLmI3Ksywu54xE6BlhVgggA,114
+pip/_vendor/html5lib/_trie/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/html5lib/_trie/__pycache__/_base.cpython-39.pyc,,
+pip/_vendor/html5lib/_trie/__pycache__/py.cpython-39.pyc,,
+pip/_vendor/html5lib/_trie/_base.py,sha256=LTpLNz1pn7LAcfn2TFvRp4moVPbFTkkbhzjPKUrvGes,1053
+pip/_vendor/html5lib/_trie/py.py,sha256=LmuYcbypKw-aMLcT0-IY6WewATGzg1QRkmyd8hTBQeY,1842
+pip/_vendor/html5lib/_utils.py,sha256=dLFxoZDTv5r38HOIHy45uxWwUY7VhLgbEFWNQw6Wppo,5090
+pip/_vendor/html5lib/constants.py,sha256=P9n6_ScDgAFkst0YfKaB-yaAlxVtUS9uMn5Lh8ywbQo,86410
+pip/_vendor/html5lib/filters/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+pip/_vendor/html5lib/filters/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/html5lib/filters/__pycache__/alphabeticalattributes.cpython-39.pyc,,
+pip/_vendor/html5lib/filters/__pycache__/base.cpython-39.pyc,,
+pip/_vendor/html5lib/filters/__pycache__/inject_meta_charset.cpython-39.pyc,,
+pip/_vendor/html5lib/filters/__pycache__/lint.cpython-39.pyc,,
+pip/_vendor/html5lib/filters/__pycache__/optionaltags.cpython-39.pyc,,
+pip/_vendor/html5lib/filters/__pycache__/sanitizer.cpython-39.pyc,,
+pip/_vendor/html5lib/filters/__pycache__/whitespace.cpython-39.pyc,,
+pip/_vendor/html5lib/filters/alphabeticalattributes.py,sha256=0TV6VWJzhNkcLFiR7BNZUJsTJgAEEyZ02in6-PuL2gU,948
+pip/_vendor/html5lib/filters/base.py,sha256=6D2t423hbOLtjnvAAOrs1mWX1vsabMLBrWQF67ITPho,298
+pip/_vendor/html5lib/filters/inject_meta_charset.py,sha256=J-W5X3LyosH1sUipiHU1x-2ocd7g9JSudpIek_QlCUU,3018
+pip/_vendor/html5lib/filters/lint.py,sha256=O6sK29HXXW02Nv-EIEOfGvdQMuXxWvBePu2sQ2ecbJc,3736
+pip/_vendor/html5lib/filters/optionaltags.py,sha256=IVHcJ35kr6_MYBqahFMIK-Gel-ALLUk6Wk9X-or_yXk,10795
+pip/_vendor/html5lib/filters/sanitizer.py,sha256=uwT0HNJHjnw3Omf2LpmvfoVdIgAWb9_3VrMcWD1es_M,27813
+pip/_vendor/html5lib/filters/whitespace.py,sha256=bCC0mMQZicbq8HCg67pip_oScN5Fz_KkkvldfE137Kw,1252
+pip/_vendor/html5lib/html5parser.py,sha256=2xGZMaUvdkuuswAmpkazK1CXHT_y3-XTy4lS71PYUuU,119981
+pip/_vendor/html5lib/serializer.py,sha256=vMivcnRcQxjCSTrbMFdevLMhJ2HbF0cfv_CkroTODZM,16168
+pip/_vendor/html5lib/treeadapters/__init__.py,sha256=76InX2oJAx-C4rGAJziZsoE_CHI8_3thl6TeMgP-ypk,709
+pip/_vendor/html5lib/treeadapters/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/html5lib/treeadapters/__pycache__/genshi.cpython-39.pyc,,
+pip/_vendor/html5lib/treeadapters/__pycache__/sax.cpython-39.pyc,,
+pip/_vendor/html5lib/treeadapters/genshi.py,sha256=nQHNa4Hu0IMpu4bqHbJJS3_Cd1pKXgDO1pgMZ6gADDg,1769
+pip/_vendor/html5lib/treeadapters/sax.py,sha256=PAmV6NG9BSpfMHUY72bDbXwAe6Q2tPn1BC2yAD-K1G0,1826
+pip/_vendor/html5lib/treebuilders/__init__.py,sha256=zfrXDjeqDo2M7cJFax6hRJs70Az4pfHFiZbuLOZ9YE4,3680
+pip/_vendor/html5lib/treebuilders/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/html5lib/treebuilders/__pycache__/base.cpython-39.pyc,,
+pip/_vendor/html5lib/treebuilders/__pycache__/dom.cpython-39.pyc,,
+pip/_vendor/html5lib/treebuilders/__pycache__/etree.cpython-39.pyc,,
+pip/_vendor/html5lib/treebuilders/__pycache__/etree_lxml.cpython-39.pyc,,
+pip/_vendor/html5lib/treebuilders/base.py,sha256=Yao9LOJd-4KaLEx-3ysqRkAkhv1YaDqhTksvX6nuQyY,14982
+pip/_vendor/html5lib/treebuilders/dom.py,sha256=QWkBtUprtDosTiTFlIY6QpgKwk2-pD0AV84qVTNgiLo,9164
+pip/_vendor/html5lib/treebuilders/etree.py,sha256=k-LHrme562Hd5GmIi87r1_vfF25MtURGPurT3mAp8sY,13179
+pip/_vendor/html5lib/treebuilders/etree_lxml.py,sha256=CviyyGjvv2TwN-m47DC8DFWdx0Gt-atRw9jMTv4v8-Q,15158
+pip/_vendor/html5lib/treewalkers/__init__.py,sha256=buyxCJb9LFfJ_1ZIMdc-Do1zV93Uw-7L942o2H-Swy0,5873
+pip/_vendor/html5lib/treewalkers/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/html5lib/treewalkers/__pycache__/base.cpython-39.pyc,,
+pip/_vendor/html5lib/treewalkers/__pycache__/dom.cpython-39.pyc,,
+pip/_vendor/html5lib/treewalkers/__pycache__/etree.cpython-39.pyc,,
+pip/_vendor/html5lib/treewalkers/__pycache__/etree_lxml.cpython-39.pyc,,
+pip/_vendor/html5lib/treewalkers/__pycache__/genshi.cpython-39.pyc,,
+pip/_vendor/html5lib/treewalkers/base.py,sha256=g-cLq7VStBtpZZZ1v_Tbwp3GhJjQ2oG5njgeHVhAaXE,7728
+pip/_vendor/html5lib/treewalkers/dom.py,sha256=fBJht3gn5a6y1WN2KE9gsUru158yTQ0KikT3vOM7Xc4,1456
+pip/_vendor/html5lib/treewalkers/etree.py,sha256=VtcKOS13qy9nv2PAaYoB1j9V1Z8n9o0AEA9KoGAgYOg,4682
+pip/_vendor/html5lib/treewalkers/etree_lxml.py,sha256=u9X06RqSrHanDb0qGI-v8I99-PqzOzmnpZOspHHz_Io,6572
+pip/_vendor/html5lib/treewalkers/genshi.py,sha256=P_2Tnc2GkbWJfuodXN9oYIg6kN9E26aWXXe9iL0_eX4,2378
+pip/_vendor/idna/__init__.py,sha256=_0n4R0OXufy1HIcXEOxgJCUCHGDqtazhMdYBVIXc320,60
+pip/_vendor/idna/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/idna/__pycache__/codec.cpython-39.pyc,,
+pip/_vendor/idna/__pycache__/compat.cpython-39.pyc,,
+pip/_vendor/idna/__pycache__/core.cpython-39.pyc,,
+pip/_vendor/idna/__pycache__/idnadata.cpython-39.pyc,,
+pip/_vendor/idna/__pycache__/intranges.cpython-39.pyc,,
+pip/_vendor/idna/__pycache__/package_data.cpython-39.pyc,,
+pip/_vendor/idna/__pycache__/uts46data.cpython-39.pyc,,
+pip/_vendor/idna/codec.py,sha256=NDQdy95NUcd00WV5Qh0QOpZvYJzIpaMVb9ME0hKuQ80,3417
+pip/_vendor/idna/compat.py,sha256=QPaSi9bWqUO7OAXmC0brJFYc1zweHI3JnA7HiM2BlQA,244
+pip/_vendor/idna/core.py,sha256=9uPbfjxEP-fiU9QL8dtxNnaZHyZr7YUtUS1V0GaNB48,12351
+pip/_vendor/idna/idnadata.py,sha256=qUMdMMOBhxlR7CJpeXFUy97pBTZRwhWKa3zIhulao0k,44400
+pip/_vendor/idna/intranges.py,sha256=K5VTgP3Cn6UOQwinqj0O2stySFQoTb8xrLFKyg-hulg,1802
+pip/_vendor/idna/package_data.py,sha256=JS73h8bhkMB0AKLCXZ-Hbt660VMRAFBcP9drX0lX52s,24
+pip/_vendor/idna/uts46data.py,sha256=oxTG_Nku3jRCkXmSOL2yg_TCQHhH43uN2bDtuJ8xoCc,210441
+pip/_vendor/msgpack/__init__.py,sha256=OhoFouHD7wOYMP2PN-Hlyk9RAZw39V-iPTDRsmkoIns,1172
+pip/_vendor/msgpack/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/msgpack/__pycache__/_version.cpython-39.pyc,,
+pip/_vendor/msgpack/__pycache__/exceptions.cpython-39.pyc,,
+pip/_vendor/msgpack/__pycache__/ext.cpython-39.pyc,,
+pip/_vendor/msgpack/__pycache__/fallback.cpython-39.pyc,,
+pip/_vendor/msgpack/_version.py,sha256=qcv5IclQy1PSvtCYDvZyjaUSFWdHPIRzdGjv3YwkKCs,21
+pip/_vendor/msgpack/exceptions.py,sha256=2fCtczricqQgdT3NtW6cTqmZn3WA7GQtmlPuT-NhLyM,1129
+pip/_vendor/msgpack/ext.py,sha256=3Xznjz11nxxfQJe50uLzKDznWOvxOBxWSZ833DL_DDs,6281
+pip/_vendor/msgpack/fallback.py,sha256=ZaNwBMO2hh9WrqHnYqdHJaCv8zzPMnva9YhD5yInTpM,39113
+pip/_vendor/packaging/__about__.py,sha256=h9QOAlgXk51CVUXfD2djDO8X7z2DKjnIxkEcmCHalTc,753
+pip/_vendor/packaging/__init__.py,sha256=UcApkMPyWGcIGgYWGi5lL5uzPYpelyaOPRXhgdUhCiw,588
+pip/_vendor/packaging/__pycache__/__about__.cpython-39.pyc,,
+pip/_vendor/packaging/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/packaging/__pycache__/_compat.cpython-39.pyc,,
+pip/_vendor/packaging/__pycache__/_structures.cpython-39.pyc,,
+pip/_vendor/packaging/__pycache__/_typing.cpython-39.pyc,,
+pip/_vendor/packaging/__pycache__/markers.cpython-39.pyc,,
+pip/_vendor/packaging/__pycache__/requirements.cpython-39.pyc,,
+pip/_vendor/packaging/__pycache__/specifiers.cpython-39.pyc,,
+pip/_vendor/packaging/__pycache__/tags.cpython-39.pyc,,
+pip/_vendor/packaging/__pycache__/utils.cpython-39.pyc,,
+pip/_vendor/packaging/__pycache__/version.cpython-39.pyc,,
+pip/_vendor/packaging/_compat.py,sha256=wtTUbVAZPwwTy4_8aQjCedCpQVcy-CTvCZv1Ri3IvhY,1166
+pip/_vendor/packaging/_structures.py,sha256=0DUfMS4mYkvzf_89F1f5SRSbYtcxdikc3TvzgCnxeo0,2108
+pip/_vendor/packaging/_typing.py,sha256=n1Xr-giO86iFpEEEkYKWWGZetBwnyYbwhcr-EuId0G0,1872
+pip/_vendor/packaging/markers.py,sha256=1Fj8RWPWbNhnOsSZAYqs7JRI6-aOBzEau7u9UcnFKLk,9808
+pip/_vendor/packaging/requirements.py,sha256=Oxps2CfRKfaPNGWICAv5eUeUwddVOmOfHuLKlQ1k6MU,5270
+pip/_vendor/packaging/specifiers.py,sha256=BCbv9EegYKBiwB5qOLkAVK6sAVCrHdGIeVfzzGznn4c,33072
+pip/_vendor/packaging/tags.py,sha256=aOIFGI46FLvkJpDwy858fFdrHbydPRu1caLTkI8UTOo,30427
+pip/_vendor/packaging/utils.py,sha256=QSspLOaGAUqVnR8c1dHpIHIOQwJHydchP7HnnzMCHSY,4523
+pip/_vendor/packaging/version.py,sha256=QmDlgceXJ0sPNCc2Oe4yda6lShIItK7C1nZVmd-Sq5g,16530
+pip/_vendor/pep517/__init__.py,sha256=ure7CT2epH277sv3FqdoG-8BaydDdFnJnU1d4z15NQI,135
+pip/_vendor/pep517/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/pep517/__pycache__/_in_process.cpython-39.pyc,,
+pip/_vendor/pep517/__pycache__/build.cpython-39.pyc,,
+pip/_vendor/pep517/__pycache__/check.cpython-39.pyc,,
+pip/_vendor/pep517/__pycache__/colorlog.cpython-39.pyc,,
+pip/_vendor/pep517/__pycache__/compat.cpython-39.pyc,,
+pip/_vendor/pep517/__pycache__/dirtools.cpython-39.pyc,,
+pip/_vendor/pep517/__pycache__/envbuild.cpython-39.pyc,,
+pip/_vendor/pep517/__pycache__/meta.cpython-39.pyc,,
+pip/_vendor/pep517/__pycache__/wrappers.cpython-39.pyc,,
+pip/_vendor/pep517/_in_process.py,sha256=R6B_Ol8FFxdRdbZ1R35CIL0glBjC-seixM2i0zasCTg,8718
+pip/_vendor/pep517/build.py,sha256=_LmMkH9mASElZ4lRRCwzmIrQedeguL5ocpSO0zPh6M0,3459
+pip/_vendor/pep517/check.py,sha256=qf0B_WXekszLi8IQb6Xv8raz5D5Ra-CdUmFjvnfbwdc,6164
+pip/_vendor/pep517/colorlog.py,sha256=uOdcoDYZ0ocKGRPPs5JgvpLYVDIfoEVvoMpc43ICQFU,4213
+pip/_vendor/pep517/compat.py,sha256=1jqYeQ-XtQzmaxIHxESnGU313ZBanlnusKD2gNBzRKQ,814
+pip/_vendor/pep517/dirtools.py,sha256=hrSzAJOGDo3tXdtCbgJ6LqoLhPVJn6JGmekz1ofLi6o,1173
+pip/_vendor/pep517/envbuild.py,sha256=Ji_P7ePNXexLOSqBlKyoyQqZQXNxF7-Xp3bF5XcsGgM,6208
+pip/_vendor/pep517/meta.py,sha256=ZkHYB0YHt4FDuSE5NdFuVsat3xfZ6LgW6VS6d4D6vIQ,2555
+pip/_vendor/pep517/wrappers.py,sha256=DLtY2zCWCyhWaVv8_AQcdUs0aou704Uos9vlCuiMLuc,11617
+pip/_vendor/pkg_resources/__init__.py,sha256=zeMvnKzGEcWISjTwy6YtFKWamTFJdwBwYjBAFUoyf7A,111573
+pip/_vendor/pkg_resources/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/pkg_resources/__pycache__/py31compat.cpython-39.pyc,,
+pip/_vendor/pkg_resources/py31compat.py,sha256=tzQGe-w8g7GEXb6Ozw2-v8ZHaIygADmw0LAgriYzPAc,585
+pip/_vendor/progress/__init__.py,sha256=YTntFxK5CZDfVAa1b77EbNkWljGqvyM72YKRTHaHap8,5034
+pip/_vendor/progress/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/progress/__pycache__/bar.cpython-39.pyc,,
+pip/_vendor/progress/__pycache__/counter.cpython-39.pyc,,
+pip/_vendor/progress/__pycache__/spinner.cpython-39.pyc,,
+pip/_vendor/progress/bar.py,sha256=evFQod41y2bMU60teK16uV-A5F4yVUehab8dtCiXj1E,2945
+pip/_vendor/progress/counter.py,sha256=c8AdstUGrFQvIQbvtHjjXxZx6LCflrY-a7DVM6IYTBs,1413
+pip/_vendor/progress/spinner.py,sha256=zLcx2RFinPfM6UwveJJrcJ8YABE3aLCAKqQFVD3pHow,1423
+pip/_vendor/pyparsing.py,sha256=lD3A8iEK1JYvnNDP00Cgve4vZjwEFonCvrpo7mEl3Q8,280501
+pip/_vendor/requests/__init__.py,sha256=ZPcnlAopNRpI2-4_FZKv1_SbCBwlwTxi-mKRwZhdPec,4600
+pip/_vendor/requests/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/__version__.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/_internal_utils.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/adapters.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/api.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/auth.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/certs.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/compat.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/cookies.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/exceptions.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/help.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/hooks.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/models.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/packages.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/sessions.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/status_codes.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/structures.cpython-39.pyc,,
+pip/_vendor/requests/__pycache__/utils.cpython-39.pyc,,
+pip/_vendor/requests/__version__.py,sha256=PYPw-iruqHi6_VTCebDNFpTbGld8EbCSw6EuZDH0c28,455
+pip/_vendor/requests/_internal_utils.py,sha256=zDALdxFfs4pAp4CME_TTw2rGyYR2EGBpSehYHgfn8o0,1138
+pip/_vendor/requests/adapters.py,sha256=v-nXh1zlxNzGQWQicaDrnsMmus75p2c99GPOtPl-6uw,22081
+pip/_vendor/requests/api.py,sha256=wQeIxib0gxc8KyQqF3oMwV2r7sszjJc2hbhGV_ZMzFQ,6657
+pip/_vendor/requests/auth.py,sha256=xe7s91xl3ENjQgRlZP3-2KsACnXYHAiLOuHLDw6nyyQ,10512
+pip/_vendor/requests/certs.py,sha256=fFBPJjnP90gWELetFYPbzrsfZgSZopej7XhlkrnVVec,483
+pip/_vendor/requests/compat.py,sha256=xfkhI1O0M1RPT9n92GEeoalPuBOYMsdApi7TONmwWD8,2121
+pip/_vendor/requests/cookies.py,sha256=PIxSKntoUn6l2irwR-CBMgm0scK8s-6yUZzwoCVIAdo,18979
+pip/_vendor/requests/exceptions.py,sha256=PIWWBbIjGPntNY_KDJMYzEqrBCmjh5d3rk7vZt2pXZI,3296
+pip/_vendor/requests/help.py,sha256=cU7c_l65QBsGALbTfqkHIeXpEKJ5cPih6N7Xcj9jjIQ,3697
+pip/_vendor/requests/hooks.py,sha256=LAWGUHI8SB52PkhFYbwyPcT6mWsjuVJeeZpM0RUTADc,791
+pip/_vendor/requests/models.py,sha256=JF52k_hco_uYxvg91Dhhc1c171lf7h6wVbBT0D7wxgA,35329
+pip/_vendor/requests/packages.py,sha256=ry2VlKGoCDdr8ZJyNCXxDcAF1HfENfmoylCK-_VzXh0,711
+pip/_vendor/requests/sessions.py,sha256=mpQg1Iz7Yg_IhfE87tkb2QwC3yS7JwiE3Ewe6mum_iY,30918
+pip/_vendor/requests/status_codes.py,sha256=ef_TQlJHa44J6_nrl_hTw6h7I-oZS8qg2MHCu9YyzYY,4311
+pip/_vendor/requests/structures.py,sha256=ssrNLrH8XELX89hk4yRQYEVeHnbopq1HAJBfgu38bi8,3110
+pip/_vendor/requests/utils.py,sha256=9CyTbt6eajb0LurVm10A9gSOYZ-PNSjEjz3XZ4U7Ywk,31521
+pip/_vendor/resolvelib/__init__.py,sha256=lzKfakTdPCSwU0ka5lroJTWCp5oHH50S35PI79aCufA,563
+pip/_vendor/resolvelib/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/resolvelib/__pycache__/providers.cpython-39.pyc,,
+pip/_vendor/resolvelib/__pycache__/reporters.cpython-39.pyc,,
+pip/_vendor/resolvelib/__pycache__/resolvers.cpython-39.pyc,,
+pip/_vendor/resolvelib/__pycache__/structs.cpython-39.pyc,,
+pip/_vendor/resolvelib/compat/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+pip/_vendor/resolvelib/compat/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/resolvelib/compat/__pycache__/collections_abc.cpython-39.pyc,,
+pip/_vendor/resolvelib/compat/collections_abc.py,sha256=MheZFF7RxE2-89xgOpds4n99OEzq1RZTU5q4UVXNnQU,133
+pip/_vendor/resolvelib/providers.py,sha256=K5PbvYNuo9J_CgBDXknQpgNzQLuRucz8cSY-jMAry4o,5210
+pip/_vendor/resolvelib/reporters.py,sha256=Yi7l5VMEKyhL20KIEglPukQHWJHkweV4e4amcJs-4yk,1401
+pip/_vendor/resolvelib/resolvers.py,sha256=CEQp-FpwW9aKbkhrJoBoMp0i6aZx_LW-J_nENmdlU_w,16992
+pip/_vendor/resolvelib/structs.py,sha256=kbTC6heWXe96iLb0F7KdoxoTvmujcTsT5TX-ODuY2qg,4557
+pip/_vendor/retrying.py,sha256=LfbAQSee7r9F4SHbBcI1OBu7OLSDDr04Qsw9zkuC0Jw,10239
+pip/_vendor/six.py,sha256=PB_L4p2xXUH81qAYSIWp7iQRf-RU858yzM8bUfpyYBY,35141
+pip/_vendor/toml/__init__.py,sha256=mT8JBhpMcIoJeu-CrAAxPwe_d-xt-5pr9T_phq398TA,772
+pip/_vendor/toml/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/toml/__pycache__/decoder.cpython-39.pyc,,
+pip/_vendor/toml/__pycache__/encoder.cpython-39.pyc,,
+pip/_vendor/toml/__pycache__/ordered.cpython-39.pyc,,
+pip/_vendor/toml/__pycache__/tz.cpython-39.pyc,,
+pip/_vendor/toml/decoder.py,sha256=Uxh7DDx_iAkVMDWZsE4kFUZFFIklqOs_sRzFjJ0pwqY,40011
+pip/_vendor/toml/encoder.py,sha256=r7njY4Dgtsae5o5X-WS5LcQeIFguk1b3eqOioqqM1Ck,10268
+pip/_vendor/toml/ordered.py,sha256=byuDm6cI-nc2D37R4ae3soCb-k4pt48LpLh3e8LD2Fw,393
+pip/_vendor/toml/tz.py,sha256=gyy65HjpDD5I7ujKkL5iWLrUj3uFVpS08ls_btZ0uoY,725
+pip/_vendor/urllib3/__init__.py,sha256=FzLqycdKhCzSxjYOPTX50D3qf0lTCe6UgfZdwT-Li7o,2848
+pip/_vendor/urllib3/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/urllib3/__pycache__/_collections.cpython-39.pyc,,
+pip/_vendor/urllib3/__pycache__/_version.cpython-39.pyc,,
+pip/_vendor/urllib3/__pycache__/connection.cpython-39.pyc,,
+pip/_vendor/urllib3/__pycache__/connectionpool.cpython-39.pyc,,
+pip/_vendor/urllib3/__pycache__/exceptions.cpython-39.pyc,,
+pip/_vendor/urllib3/__pycache__/fields.cpython-39.pyc,,
+pip/_vendor/urllib3/__pycache__/filepost.cpython-39.pyc,,
+pip/_vendor/urllib3/__pycache__/poolmanager.cpython-39.pyc,,
+pip/_vendor/urllib3/__pycache__/request.cpython-39.pyc,,
+pip/_vendor/urllib3/__pycache__/response.cpython-39.pyc,,
+pip/_vendor/urllib3/_collections.py,sha256=RQtWWhudTDETvb2BCVqih1QTpXS2Q5HSf77UJY5ditA,11148
+pip/_vendor/urllib3/_version.py,sha256=y3H2R2qrG0QbjqKtuJNDmsD6z1luXDp-kD1fTjDzdGs,65
+pip/_vendor/urllib3/connection.py,sha256=lyJSLSrRVMHfktX6t9Vtvx4yBcAwOvdkSWjk3HhWfkA,19024
+pip/_vendor/urllib3/connectionpool.py,sha256=yFGc0n8ZWlHr7PaXlRGWiRydYOlJb5mVRNjXrgC7q28,38200
+pip/_vendor/urllib3/contrib/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+pip/_vendor/urllib3/contrib/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/urllib3/contrib/__pycache__/_appengine_environ.cpython-39.pyc,,
+pip/_vendor/urllib3/contrib/__pycache__/appengine.cpython-39.pyc,,
+pip/_vendor/urllib3/contrib/__pycache__/ntlmpool.cpython-39.pyc,,
+pip/_vendor/urllib3/contrib/__pycache__/pyopenssl.cpython-39.pyc,,
+pip/_vendor/urllib3/contrib/__pycache__/securetransport.cpython-39.pyc,,
+pip/_vendor/urllib3/contrib/__pycache__/socks.cpython-39.pyc,,
+pip/_vendor/urllib3/contrib/_appengine_environ.py,sha256=POYJSeNWacJYwXQdv0If3v56lcoiHuL6MQE8pwG1Yoc,993
+pip/_vendor/urllib3/contrib/_securetransport/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+pip/_vendor/urllib3/contrib/_securetransport/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/urllib3/contrib/_securetransport/__pycache__/bindings.cpython-39.pyc,,
+pip/_vendor/urllib3/contrib/_securetransport/__pycache__/low_level.cpython-39.pyc,,
+pip/_vendor/urllib3/contrib/_securetransport/bindings.py,sha256=jreOmmwBW-Cio0m7I_OjmP028nqgrGuo_oB2f7Gir3s,18168
+pip/_vendor/urllib3/contrib/_securetransport/low_level.py,sha256=0KKeznz3h0z-SBDbCtGorDfgCgiZ30VQOqkX5ZgaPBY,14304
+pip/_vendor/urllib3/contrib/appengine.py,sha256=GObYFGhIv3PUW1-SRONBUac4kr2ja2dfyZhe1WJb0JY,11348
+pip/_vendor/urllib3/contrib/ntlmpool.py,sha256=xd-sWgSxZh-kNrUhzhcb7bRNiEvywF3GlRGv4xPpDI8,4281
+pip/_vendor/urllib3/contrib/pyopenssl.py,sha256=2EUnc5DS6QpjrnMMCxeT_nVuhP6Kzmw0rbo3aBCddEI,17304
+pip/_vendor/urllib3/contrib/securetransport.py,sha256=-Je5h1SDUr-8rX8dh8UZWsi90qoHkhT_oZhpsCLqwHw,35223
+pip/_vendor/urllib3/contrib/socks.py,sha256=NVZv5069T4TPXMtDnt8revc8Jgee0oxHX-zYeWrP36c,7313
+pip/_vendor/urllib3/exceptions.py,sha256=_ofwiuS3iKNqq2bodJzZ1CIXzm-hVwNJ0WMN5UoOnno,8123
+pip/_vendor/urllib3/fields.py,sha256=0KSfpuXxzXUMLkI2npM9siWOqCJO1H4wCiJN6neVmlA,8853
+pip/_vendor/urllib3/filepost.py,sha256=BVkEES0YAO9tFwXGBj1mD9yO92pRwk4pX5Q6cO5IRb8,2538
+pip/_vendor/urllib3/packages/__init__.py,sha256=FsOIVHqBFBlT3XxZnaD5y2yq0mvtVwmY4kut3GEfcBI,113
+pip/_vendor/urllib3/packages/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/urllib3/packages/__pycache__/six.cpython-39.pyc,,
+pip/_vendor/urllib3/packages/backports/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+pip/_vendor/urllib3/packages/backports/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/urllib3/packages/backports/__pycache__/makefile.cpython-39.pyc,,
+pip/_vendor/urllib3/packages/backports/makefile.py,sha256=DREmQjGcs2LoVH_Q3hrggClhTNdcI5Y3GJglsuihjAM,1468
+pip/_vendor/urllib3/packages/six.py,sha256=41omxbNReajvLUN-9qdHM6iAEisho1JDaZ1krmNu-jE,33557
+pip/_vendor/urllib3/packages/ssl_match_hostname/__init__.py,sha256=ceiD4ynQtrlnos1yI1RSqaETeLiNRzzAtxYsRGApR4s,779
+pip/_vendor/urllib3/packages/ssl_match_hostname/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/urllib3/packages/ssl_match_hostname/__pycache__/_implementation.cpython-39.pyc,,
+pip/_vendor/urllib3/packages/ssl_match_hostname/_implementation.py,sha256=WXs1yNtk9PsYVmeTJQAcqeAm81zbzeabEWWf-xRJSAo,5839
+pip/_vendor/urllib3/poolmanager.py,sha256=blNTYqVqFg9zUGncVtyXk1HQsTxKO1Cy9hfGVLAGvhM,20299
+pip/_vendor/urllib3/request.py,sha256=Fe4bQCUhum8qh3t1dihpSsQwdyfd5nB44cNX8566DmM,6155
+pip/_vendor/urllib3/response.py,sha256=LjfUJBUhwPrJTrGgNI3WoySUizNEPd1Xiv71YxE2J7Y,29024
+pip/_vendor/urllib3/util/__init__.py,sha256=UV_J7p9b29cJXXQ6wTvBZppJDLUeKQ6mcv0v1ptl13c,1204
+pip/_vendor/urllib3/util/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/urllib3/util/__pycache__/connection.cpython-39.pyc,,
+pip/_vendor/urllib3/util/__pycache__/proxy.cpython-39.pyc,,
+pip/_vendor/urllib3/util/__pycache__/queue.cpython-39.pyc,,
+pip/_vendor/urllib3/util/__pycache__/request.cpython-39.pyc,,
+pip/_vendor/urllib3/util/__pycache__/response.cpython-39.pyc,,
+pip/_vendor/urllib3/util/__pycache__/retry.cpython-39.pyc,,
+pip/_vendor/urllib3/util/__pycache__/ssl_.cpython-39.pyc,,
+pip/_vendor/urllib3/util/__pycache__/ssltransport.cpython-39.pyc,,
+pip/_vendor/urllib3/util/__pycache__/timeout.cpython-39.pyc,,
+pip/_vendor/urllib3/util/__pycache__/url.cpython-39.pyc,,
+pip/_vendor/urllib3/util/__pycache__/wait.cpython-39.pyc,,
+pip/_vendor/urllib3/util/connection.py,sha256=3mqDvgNGtru8tW3IFIckXj2y-6WsLFIabIRFdNMzoZs,5072
+pip/_vendor/urllib3/util/proxy.py,sha256=xMGYpCWlY1Obf1nod_fhOG3rk3NTUM2q_BJmj6B_NmU,1660
+pip/_vendor/urllib3/util/queue.py,sha256=mY2d0cfoJG51UEKZwk_sJMwYraofNfQWq7Larj9xh_o,520
+pip/_vendor/urllib3/util/request.py,sha256=O-NJTFysuN_wgY33pne8xA1P35qv3R7uh67ER9zwqYM,4266
+pip/_vendor/urllib3/util/response.py,sha256=685vBStgxTo8u3KoOilR6Kuh7IGPZr7TmzrP9awEtqU,3617
+pip/_vendor/urllib3/util/retry.py,sha256=v0qyO6YScY6KUoOmY2e_Q185IgioBJZP-_Ltthymc9Q,21967
+pip/_vendor/urllib3/util/ssl_.py,sha256=bvtqRNwp5hZBdDBhZZtKmie0r2VQZtYyPUKdq8iESGQ,16755
+pip/_vendor/urllib3/util/ssltransport.py,sha256=ALVjoGJbZJgWVjkepN82OR_YJu9-hPF49isTfDARzaM,7153
+pip/_vendor/urllib3/util/timeout.py,sha256=Ym2WjTspeYp4fzcCYGTQ5aSU1neVSMHXBAgDp1rcETw,10271
+pip/_vendor/urllib3/util/url.py,sha256=swNcZAmCREhcoLg-uk7ZhPrPRPGidDTPiAn8CpUG4h8,14411
+pip/_vendor/urllib3/util/wait.py,sha256=qk2qJQNb3FjhOm9lKJtpByhnsLWRVapWdhW_NLr7Eog,5557
+pip/_vendor/vendor.txt,sha256=vZicYA5EfWGG74RBvGLyAfH0dqIZ0KACbWfDJ9IIOZI,412
+pip/_vendor/webencodings/__init__.py,sha256=kG5cBDbIrAtrrdU-h1iSPVYq10ayTFldU1CTRcb1ym4,10921
+pip/_vendor/webencodings/__pycache__/__init__.cpython-39.pyc,,
+pip/_vendor/webencodings/__pycache__/labels.cpython-39.pyc,,
+pip/_vendor/webencodings/__pycache__/mklabels.cpython-39.pyc,,
+pip/_vendor/webencodings/__pycache__/tests.cpython-39.pyc,,
+pip/_vendor/webencodings/__pycache__/x_user_defined.cpython-39.pyc,,
+pip/_vendor/webencodings/labels.py,sha256=e9gPVTA1XNYalJMVVX7lXvb52Kurc4UdnXFJDPcBXFE,9210
+pip/_vendor/webencodings/mklabels.py,sha256=tyhoDDc-TC6kjJY25Qn5TlsyMs2_IyPf_rfh4L6nSrg,1364
+pip/_vendor/webencodings/tests.py,sha256=7J6TdufKEml8sQSWcYEsl-e73QXtPmsIHF6pPT0sq08,6716
+pip/_vendor/webencodings/x_user_defined.py,sha256=CMn5bx2ccJ4y3Bsqf3xC24bYO4ONC3ZY_lv5vrqhKAY,4632
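The RECORD entries above follow the wheel conventions (PEP 376 / PEP 427): each line is `path,sha256=<digest>,size`, where the digest is the urlsafe-base64 encoding of the file's SHA-256 hash with `=` padding stripped, and where hash and size are left empty for `.pyc` files generated at install time. A minimal sketch of how such a hash field is computed (the function name is ours, not pip's):

```python
import base64
import hashlib


def record_hash(data: bytes) -> str:
    """Compute a wheel-RECORD style hash field for file contents.

    Per PEP 427, this is urlsafe-base64(sha256(data)) with the
    trailing '=' padding removed, prefixed with the hash name.
    """
    digest = hashlib.sha256(data).digest()
    encoded = base64.urlsafe_b64encode(digest).rstrip(b"=")
    return "sha256=" + encoded.decode("ascii")
```

Note that the zero-byte `__init__.py` files in the listing all share the same digest (`47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU`), which is simply the hash of empty input.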
diff --git a/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/WHEEL b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/WHEEL
new file mode 100644
index 0000000..385faab
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/WHEEL
@@ -0,0 +1,5 @@
+Wheel-Version: 1.0
+Generator: bdist_wheel (0.36.2)
+Root-Is-Purelib: true
+Tag: py3-none-any
+
diff --git a/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/entry_points.txt b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/entry_points.txt
new file mode 100644
index 0000000..9609f72
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/entry_points.txt
@@ -0,0 +1,5 @@
+[console_scripts]
+pip = pip._internal.cli.main:main
+pip3 = pip._internal.cli.main:main
+pip3.9 = pip._internal.cli.main:main
+
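The `entry_points.txt` hunk above declares three console scripts (`pip`, `pip3`, `pip3.9`), all pointing at the same `module:function` target; at install time the installer generates a launcher for each name that imports the module and calls the attribute. A rough sketch of how one `name = module:attr` line breaks apart (this parser is illustrative, not the packaging library's implementation):

```python
def parse_entry_point(spec: str):
    """Split a console_scripts line like 'pip = pip._internal.cli.main:main'
    into (script name, module path, callable attribute)."""
    name, _, target = (part.strip() for part in spec.partition("="))
    module, _, attr = target.partition(":")
    return name, module, attr
```

On Python 3.8+, `importlib.metadata.entry_points()` exposes the same data programmatically without parsing the file by hand.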
diff --git a/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/top_level.txt b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/top_level.txt
new file mode 100644
index 0000000..a1b589e
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip-21.0.1.dist-info/top_level.txt
@@ -0,0 +1 @@
+pip
diff --git a/.venv/lib/python3.9/site-packages/pip/__init__.py b/.venv/lib/python3.9/site-packages/pip/__init__.py
new file mode 100644
index 0000000..b46254b
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/__init__.py
@@ -0,0 +1,18 @@
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from typing import List, Optional
+
+
+__version__ = "21.0.1"
+
+
+def main(args=None):
+ # type: (Optional[List[str]]) -> int
+ """This is an internal API only meant for use by pip's own console scripts.
+
+ For additional details, see https://github.com/pypa/pip/issues/7498.
+ """
+ from pip._internal.utils.entrypoints import _wrapper
+
+ return _wrapper(args)
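`pip/__init__.py` above uses pip 21.x's `MYPY_CHECK_RUNNING` guard together with comment-style type annotations, a pattern that keeps `typing` imports off the runtime path while staying Python 2 compatible. The modern stdlib equivalent is `typing.TYPE_CHECKING`; a minimal sketch of the same idiom (`arg_count` is our own example, not a pip function):

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Imported only while type checking; never executed at runtime.
    from typing import List, Optional


def arg_count(args=None):
    # type: (Optional[List[str]]) -> int
    """Comment-style annotation, readable by mypy on Python 2 and 3 alike."""
    return len(args or [])
```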
diff --git a/.venv/lib/python3.9/site-packages/pip/__main__.py b/.venv/lib/python3.9/site-packages/pip/__main__.py
new file mode 100644
index 0000000..9bff707
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/__main__.py
@@ -0,0 +1,24 @@
+import os
+import sys
+
+# Remove '' and current working directory from the first entry
+# of sys.path, if present to avoid using current directory
+# in pip commands check, freeze, install, list and show,
+# when invoked as python -m pip
+if sys.path[0] in ('', os.getcwd()):
+ sys.path.pop(0)
+
+# If we are running from a wheel, add the wheel to sys.path
+# This allows the usage python pip-*.whl/pip install pip-*.whl
+if __package__ == '':
+ # __file__ is pip-*.whl/pip/__main__.py
+# first dirname call strips off '/__main__.py', second strips off '/pip'
+ # Resulting path is the name of the wheel itself
+ # Add that to sys.path so we can import pip
+ path = os.path.dirname(os.path.dirname(__file__))
+ sys.path.insert(0, path)
+
+from pip._internal.cli.main import main as _main
+
+if __name__ == '__main__':
+ sys.exit(_main())
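`__main__.py` above supports executing pip directly out of a wheel file (`python pip-*.whl/pip install …`) by walking two directory levels up from `__file__` to recover the wheel path, then prepending that path to `sys.path`. The double-`dirname` step can be sketched in isolation (the path below is a made-up example):

```python
import os

# Hypothetical value of __file__ when a wheel is executed directly:
main_file = "pip-21.0.1-py3-none-any.whl/pip/__main__.py"

# One dirname strips '/__main__.py', a second strips '/pip',
# leaving the wheel file itself, which is then put on sys.path
# so that 'import pip' resolves inside the zip archive.
wheel_path = os.path.dirname(os.path.dirname(main_file))
```

This works because wheels are zip files and Python's import machinery can import packages from zip entries on `sys.path`.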
diff --git a/.venv/lib/python3.9/site-packages/pip/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..9690c13
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/__pycache__/__main__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/__pycache__/__main__.cpython-39.pyc
new file mode 100644
index 0000000..ce8f9cb
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/__pycache__/__main__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/__init__.py b/.venv/lib/python3.9/site-packages/pip/_internal/__init__.py
new file mode 100644
index 0000000..9fd2d73
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/__init__.py
@@ -0,0 +1,17 @@
+import pip._internal.utils.inject_securetransport # noqa
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from typing import List, Optional
+
+
+def main(args=None):
+ # type: (Optional[List[str]]) -> int
+ """This is preserved for old console scripts that may still be referencing
+ it.
+
+ For additional details, see https://github.com/pypa/pip/issues/7498.
+ """
+ from pip._internal.utils.entrypoints import _wrapper
+
+ return _wrapper(args)
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..f99624c
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/build_env.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/build_env.cpython-39.pyc
new file mode 100644
index 0000000..3408d08
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/build_env.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/cache.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/cache.cpython-39.pyc
new file mode 100644
index 0000000..98d1884
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/cache.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/configuration.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/configuration.cpython-39.pyc
new file mode 100644
index 0000000..1de073f
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/configuration.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/exceptions.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/exceptions.cpython-39.pyc
new file mode 100644
index 0000000..1500c53
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/exceptions.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/locations.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/locations.cpython-39.pyc
new file mode 100644
index 0000000..9ddd59b
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/locations.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/main.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/main.cpython-39.pyc
new file mode 100644
index 0000000..cb11c60
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/main.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/pyproject.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/pyproject.cpython-39.pyc
new file mode 100644
index 0000000..192b6eb
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/pyproject.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/self_outdated_check.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/self_outdated_check.cpython-39.pyc
new file mode 100644
index 0000000..039ac81
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/self_outdated_check.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/wheel_builder.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/wheel_builder.cpython-39.pyc
new file mode 100644
index 0000000..a3a8cb9
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/__pycache__/wheel_builder.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/build_env.py b/.venv/lib/python3.9/site-packages/pip/_internal/build_env.py
new file mode 100644
index 0000000..f981fab
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/build_env.py
@@ -0,0 +1,242 @@
+"""Build Environment used for isolation during sdist building
+"""
+
+import logging
+import os
+import sys
+import textwrap
+from collections import OrderedDict
+from distutils.sysconfig import get_python_lib
+from sysconfig import get_paths
+
+from pip._vendor.pkg_resources import Requirement, VersionConflict, WorkingSet
+
+from pip import __file__ as pip_location
+from pip._internal.cli.spinners import open_spinner
+from pip._internal.utils.subprocess import call_subprocess
+from pip._internal.utils.temp_dir import TempDirectory, tempdir_kinds
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from types import TracebackType
+ from typing import Iterable, List, Optional, Set, Tuple, Type
+
+ from pip._internal.index.package_finder import PackageFinder
+
+logger = logging.getLogger(__name__)
+
+
+class _Prefix:
+
+ def __init__(self, path):
+ # type: (str) -> None
+ self.path = path
+ self.setup = False
+ self.bin_dir = get_paths(
+ 'nt' if os.name == 'nt' else 'posix_prefix',
+ vars={'base': path, 'platbase': path}
+ )['scripts']
+ # Note: prefer distutils' sysconfig to get the
+ # library paths so PyPy is correctly supported.
+ purelib = get_python_lib(plat_specific=False, prefix=path)
+ platlib = get_python_lib(plat_specific=True, prefix=path)
+ if purelib == platlib:
+ self.lib_dirs = [purelib]
+ else:
+ self.lib_dirs = [purelib, platlib]
+
+
+class BuildEnvironment:
+ """Creates and manages an isolated environment to install build deps
+ """
+
+ def __init__(self):
+ # type: () -> None
+ temp_dir = TempDirectory(
+ kind=tempdir_kinds.BUILD_ENV, globally_managed=True
+ )
+
+ self._prefixes = OrderedDict((
+ (name, _Prefix(os.path.join(temp_dir.path, name)))
+ for name in ('normal', 'overlay')
+ ))
+
+ self._bin_dirs = [] # type: List[str]
+ self._lib_dirs = [] # type: List[str]
+ for prefix in reversed(list(self._prefixes.values())):
+ self._bin_dirs.append(prefix.bin_dir)
+ self._lib_dirs.extend(prefix.lib_dirs)
+
+ # Customize site to:
+ # - ensure .pth files are honored
+ # - prevent access to system site packages
+ system_sites = {
+ os.path.normcase(site) for site in (
+ get_python_lib(plat_specific=False),
+ get_python_lib(plat_specific=True),
+ )
+ }
+ self._site_dir = os.path.join(temp_dir.path, 'site')
+ if not os.path.exists(self._site_dir):
+ os.mkdir(self._site_dir)
+ with open(os.path.join(self._site_dir, 'sitecustomize.py'), 'w') as fp:
+ fp.write(textwrap.dedent(
+ '''
+ import os, site, sys
+
+ # First, drop system-sites related paths.
+ original_sys_path = sys.path[:]
+ known_paths = set()
+ for path in {system_sites!r}:
+ site.addsitedir(path, known_paths=known_paths)
+ system_paths = set(
+ os.path.normcase(path)
+ for path in sys.path[len(original_sys_path):]
+ )
+ original_sys_path = [
+ path for path in original_sys_path
+ if os.path.normcase(path) not in system_paths
+ ]
+ sys.path = original_sys_path
+
+ # Second, add lib directories.
+ # ensuring .pth file are processed.
+ for path in {lib_dirs!r}:
+ assert not path in sys.path
+ site.addsitedir(path)
+ '''
+ ).format(system_sites=system_sites, lib_dirs=self._lib_dirs))
+
+ def __enter__(self):
+ # type: () -> None
+ self._save_env = {
+ name: os.environ.get(name, None)
+ for name in ('PATH', 'PYTHONNOUSERSITE', 'PYTHONPATH')
+ }
+
+ path = self._bin_dirs[:]
+ old_path = self._save_env['PATH']
+ if old_path:
+ path.extend(old_path.split(os.pathsep))
+
+ pythonpath = [self._site_dir]
+
+ os.environ.update({
+ 'PATH': os.pathsep.join(path),
+ 'PYTHONNOUSERSITE': '1',
+ 'PYTHONPATH': os.pathsep.join(pythonpath),
+ })
+
+ def __exit__(
+ self,
+ exc_type, # type: Optional[Type[BaseException]]
+ exc_val, # type: Optional[BaseException]
+ exc_tb # type: Optional[TracebackType]
+ ):
+ # type: (...) -> None
+ for varname, old_value in self._save_env.items():
+ if old_value is None:
+ os.environ.pop(varname, None)
+ else:
+ os.environ[varname] = old_value
+
+ def check_requirements(self, reqs):
+ # type: (Iterable[str]) -> Tuple[Set[Tuple[str, str]], Set[str]]
+ """Return 2 sets:
+ - conflicting requirements: set of (installed, wanted) reqs tuples
+ - missing requirements: set of reqs
+ """
+ missing = set()
+ conflicting = set()
+ if reqs:
+ ws = WorkingSet(self._lib_dirs)
+ for req in reqs:
+ try:
+ if ws.find(Requirement.parse(req)) is None:
+ missing.add(req)
+ except VersionConflict as e:
+ conflicting.add((str(e.args[0].as_requirement()),
+ str(e.args[1])))
+ return conflicting, missing
+
+ def install_requirements(
+ self,
+ finder, # type: PackageFinder
+ requirements, # type: Iterable[str]
+ prefix_as_string, # type: str
+ message # type: str
+ ):
+ # type: (...) -> None
+ prefix = self._prefixes[prefix_as_string]
+ assert not prefix.setup
+ prefix.setup = True
+ if not requirements:
+ return
+ args = [
+ sys.executable, os.path.dirname(pip_location), 'install',
+ '--ignore-installed', '--no-user', '--prefix', prefix.path,
+ '--no-warn-script-location',
+ ] # type: List[str]
+ if logger.getEffectiveLevel() <= logging.DEBUG:
+ args.append('-v')
+ for format_control in ('no_binary', 'only_binary'):
+ formats = getattr(finder.format_control, format_control)
+ args.extend(('--' + format_control.replace('_', '-'),
+ ','.join(sorted(formats or {':none:'}))))
+
+ index_urls = finder.index_urls
+ if index_urls:
+ args.extend(['-i', index_urls[0]])
+ for extra_index in index_urls[1:]:
+ args.extend(['--extra-index-url', extra_index])
+ else:
+ args.append('--no-index')
+ for link in finder.find_links:
+ args.extend(['--find-links', link])
+
+ for host in finder.trusted_hosts:
+ args.extend(['--trusted-host', host])
+ if finder.allow_all_prereleases:
+ args.append('--pre')
+ if finder.prefer_binary:
+ args.append('--prefer-binary')
+ args.append('--')
+ args.extend(requirements)
+ with open_spinner(message) as spinner:
+ call_subprocess(args, spinner=spinner)
+
+
+class NoOpBuildEnvironment(BuildEnvironment):
+ """A no-op drop-in replacement for BuildEnvironment
+ """
+
+ def __init__(self):
+ # type: () -> None
+ pass
+
+ def __enter__(self):
+ # type: () -> None
+ pass
+
+ def __exit__(
+ self,
+ exc_type, # type: Optional[Type[BaseException]]
+ exc_val, # type: Optional[BaseException]
+ exc_tb # type: Optional[TracebackType]
+ ):
+ # type: (...) -> None
+ pass
+
+ def cleanup(self):
+ # type: () -> None
+ pass
+
+ def install_requirements(
+ self,
+ finder, # type: PackageFinder
+ requirements, # type: Iterable[str]
+ prefix_as_string, # type: str
+ message # type: str
+ ):
+ # type: (...) -> None
+ raise NotImplementedError()
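`BuildEnvironment.__enter__`/`__exit__` above isolate sdist builds by overriding `PATH`, `PYTHONPATH`, and `PYTHONNOUSERSITE`, remembering each variable's prior value (`None` meaning "was unset") and restoring it on exit. That save/restore pattern, reduced to a standalone context manager (the name and API here are ours, a sketch rather than pip's implementation):

```python
import os
from contextlib import contextmanager


@contextmanager
def patched_env(**overrides):
    """Temporarily set environment variables, restoring prior state on exit.

    Mirrors the bookkeeping in BuildEnvironment: a saved value of None
    means the variable was unset and should be removed again.
    """
    saved = {name: os.environ.get(name) for name in overrides}
    os.environ.update(overrides)
    try:
        yield
    finally:
        for name, old_value in saved.items():
            if old_value is None:
                os.environ.pop(name, None)
            else:
                os.environ[name] = old_value
```

Restoring in a `finally` block matters: a failed build subprocess must not leave the caller's `PATH` pointing at the temporary prefix.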
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cache.py b/.venv/lib/python3.9/site-packages/pip/_internal/cache.py
new file mode 100644
index 0000000..e41ea42
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cache.py
@@ -0,0 +1,293 @@
+"""Cache Management
+"""
+
+import hashlib
+import json
+import logging
+import os
+
+from pip._vendor.packaging.tags import interpreter_name, interpreter_version
+from pip._vendor.packaging.utils import canonicalize_name
+
+from pip._internal.exceptions import InvalidWheelFilename
+from pip._internal.models.link import Link
+from pip._internal.models.wheel import Wheel
+from pip._internal.utils.temp_dir import TempDirectory, tempdir_kinds
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+from pip._internal.utils.urls import path_to_url
+
+if MYPY_CHECK_RUNNING:
+ from typing import Any, Dict, List, Optional, Set
+
+ from pip._vendor.packaging.tags import Tag
+
+ from pip._internal.models.format_control import FormatControl
+
+logger = logging.getLogger(__name__)
+
+
+def _hash_dict(d):
+ # type: (Dict[str, str]) -> str
+ """Return a stable sha224 of a dictionary."""
+ s = json.dumps(d, sort_keys=True, separators=(",", ":"), ensure_ascii=True)
+ return hashlib.sha224(s.encode("ascii")).hexdigest()
+
+
+class Cache:
+ """An abstract class - provides cache directories for data from links
+
+
+ :param cache_dir: The root of the cache.
+ :param format_control: An object of FormatControl class to limit
+ binaries being read from the cache.
+ :param allowed_formats: which formats of files the cache should store.
+ ('binary' and 'source' are the only allowed values)
+ """
+
+ def __init__(self, cache_dir, format_control, allowed_formats):
+ # type: (str, FormatControl, Set[str]) -> None
+ super().__init__()
+ assert not cache_dir or os.path.isabs(cache_dir)
+ self.cache_dir = cache_dir or None
+ self.format_control = format_control
+ self.allowed_formats = allowed_formats
+
+ _valid_formats = {"source", "binary"}
+ assert self.allowed_formats.union(_valid_formats) == _valid_formats
+
+ def _get_cache_path_parts(self, link):
+ # type: (Link) -> List[str]
+ """Get parts of part that must be os.path.joined with cache_dir
+ """
+
+ # We want to generate an url to use as our cache key, we don't want to
+ # just re-use the URL because it might have other items in the fragment
+ # and we don't care about those.
+ key_parts = {"url": link.url_without_fragment}
+ if link.hash_name is not None and link.hash is not None:
+ key_parts[link.hash_name] = link.hash
+ if link.subdirectory_fragment:
+ key_parts["subdirectory"] = link.subdirectory_fragment
+
+ # Include interpreter name, major and minor version in cache key
+ # to cope with ill-behaved sdists that build a different wheel
+ # depending on the python version their setup.py is being run on,
+ # and don't encode the difference in compatibility tags.
+ # https://github.com/pypa/pip/issues/7296
+ key_parts["interpreter_name"] = interpreter_name()
+ key_parts["interpreter_version"] = interpreter_version()
+
+ # Encode our key url with sha224, which has security properties similar
+ # to sha256 but a shorter output. The weaker hash makes no practical
+ # difference for our use case here.
+ hashed = _hash_dict(key_parts)
+
+ # We want to nest the directories some to prevent having a ton of top
+ # level directories where we might run out of sub directories on some
+ # FS.
+ parts = [hashed[:2], hashed[2:4], hashed[4:6], hashed[6:]]
+
+ return parts
+
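As a standalone illustration of the key-hashing and directory-nesting scheme used by `_hash_dict` and `_get_cache_path_parts` above (the URL and interpreter values below are made up for the example):

```python
import hashlib
import json
import os


def hash_dict(d):
    """Stable sha224 of a dict, mirroring _hash_dict in the diff."""
    s = json.dumps(d, sort_keys=True, separators=(",", ":"), ensure_ascii=True)
    return hashlib.sha224(s.encode("ascii")).hexdigest()


# Illustrative cache key; real keys come from a Link's url and hashes.
key_parts = {
    "url": "https://example.com/pkg-1.0.tar.gz",
    "interpreter_name": "cp",
    "interpreter_version": "39",
}
hashed = hash_dict(key_parts)
# Nest as ab/cd/ef/rest so no single directory holds every cache entry.
parts = [hashed[:2], hashed[2:4], hashed[4:6], hashed[6:]]
cache_path = os.path.join("wheels", *parts)
```

Because `sort_keys=True` fixes the serialization order, the same dict always hashes to the same path regardless of insertion order.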
+ def _get_candidates(self, link, canonical_package_name):
+ # type: (Link, str) -> List[Any]
+ can_not_cache = (
+ not self.cache_dir or
+ not canonical_package_name or
+ not link
+ )
+ if can_not_cache:
+ return []
+
+ formats = self.format_control.get_allowed_formats(
+ canonical_package_name
+ )
+ if not self.allowed_formats.intersection(formats):
+ return []
+
+ candidates = []
+ path = self.get_path_for_link(link)
+ if os.path.isdir(path):
+ for candidate in os.listdir(path):
+ candidates.append((candidate, path))
+ return candidates
+
+ def get_path_for_link(self, link):
+ # type: (Link) -> str
+ """Return a directory to store cached items in for link.
+ """
+ raise NotImplementedError()
+
+ def get(
+ self,
+ link, # type: Link
+ package_name, # type: Optional[str]
+ supported_tags, # type: List[Tag]
+ ):
+ # type: (...) -> Link
+ """Returns a link to a cached item if it exists, otherwise returns the
+ passed link.
+ """
+ raise NotImplementedError()
+
+
+class SimpleWheelCache(Cache):
+ """A cache of wheels for future installs.
+ """
+
+ def __init__(self, cache_dir, format_control):
+ # type: (str, FormatControl) -> None
+ super().__init__(cache_dir, format_control, {"binary"})
+
+ def get_path_for_link(self, link):
+ # type: (Link) -> str
+ """Return a directory to store cached wheels for link
+
+ Because there are M wheels for any one sdist, we provide a directory
+ to cache them in, and then consult that directory when looking up
+ cache hits.
+
+ We only insert things into the cache if they have plausible version
+ numbers, so that we don't contaminate the cache with things that were
+ not unique. E.g. ./package might have dozens of installs done for it
+ and build a version of 0.0. If we built and cached a wheel there, we'd
+ end up using the same wheel even after the source has been edited.
+
+ :param link: The link of the sdist for which this will cache wheels.
+ """
+ parts = self._get_cache_path_parts(link)
+ assert self.cache_dir
+ # Store wheels within the root cache_dir
+ return os.path.join(self.cache_dir, "wheels", *parts)
+
+ def get(
+ self,
+ link, # type: Link
+ package_name, # type: Optional[str]
+ supported_tags, # type: List[Tag]
+ ):
+ # type: (...) -> Link
+ candidates = []
+
+ if not package_name:
+ return link
+
+ canonical_package_name = canonicalize_name(package_name)
+ for wheel_name, wheel_dir in self._get_candidates(
+ link, canonical_package_name
+ ):
+ try:
+ wheel = Wheel(wheel_name)
+ except InvalidWheelFilename:
+ continue
+ if canonicalize_name(wheel.name) != canonical_package_name:
+ logger.debug(
+ "Ignoring cached wheel %s for %s as it "
+ "does not match the expected distribution name %s.",
+ wheel_name, link, package_name,
+ )
+ continue
+ if not wheel.supported(supported_tags):
+ # Built for a different python/arch/etc
+ continue
+ candidates.append(
+ (
+ wheel.support_index_min(supported_tags),
+ wheel_name,
+ wheel_dir,
+ )
+ )
+
+ if not candidates:
+ return link
+
+ _, wheel_name, wheel_dir = min(candidates)
+ return Link(path_to_url(os.path.join(wheel_dir, wheel_name)))
+
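A toy model of the preference logic in `SimpleWheelCache.get` above: each cached wheel is ranked by the index of its best tag in `supported_tags` (lower means more preferred, as in `Wheel.support_index_min`), and `min()` over the resulting tuples picks the winner. The tag strings and filenames here are illustrative, not real `packaging.tags` objects:

```python
# Most-preferred tags come first, as pip orders supported_tags.
supported_tags = [
    "cp39-cp39-manylinux1_x86_64",
    "cp39-abi3-manylinux1_x86_64",
    "py3-none-any",
]


def support_index_min(wheel_tags, supported):
    """Smallest index in `supported` matched by any of the wheel's tags."""
    return min(supported.index(t) for t in wheel_tags if t in supported)


candidates = []
for name, tags in [
    ("pkg-1.0-py3-none-any.whl", ["py3-none-any"]),
    ("pkg-1.0-cp39-cp39-manylinux1_x86_64.whl", ["cp39-cp39-manylinux1_x86_64"]),
]:
    candidates.append((support_index_min(tags, supported_tags), name))

# Tuples compare by first element, so the best-ranked wheel wins.
best = min(candidates)[1]
```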
+
+class EphemWheelCache(SimpleWheelCache):
+ """A SimpleWheelCache that creates it's own temporary cache directory
+ """
+
+ def __init__(self, format_control):
+ # type: (FormatControl) -> None
+ self._temp_dir = TempDirectory(
+ kind=tempdir_kinds.EPHEM_WHEEL_CACHE,
+ globally_managed=True,
+ )
+
+ super().__init__(self._temp_dir.path, format_control)
+
+
+class CacheEntry:
+ def __init__(
+ self,
+ link, # type: Link
+ persistent, # type: bool
+ ):
+ self.link = link
+ self.persistent = persistent
+
+
+class WheelCache(Cache):
+ """Wraps EphemWheelCache and SimpleWheelCache into a single Cache
+
+ This Cache allows for graceful degradation, falling back to the ephem
+ wheel cache when a link is not found in the simple wheel cache first.
+ """
+
+ def __init__(self, cache_dir, format_control):
+ # type: (str, FormatControl) -> None
+ super().__init__(cache_dir, format_control, {'binary'})
+ self._wheel_cache = SimpleWheelCache(cache_dir, format_control)
+ self._ephem_cache = EphemWheelCache(format_control)
+
+ def get_path_for_link(self, link):
+ # type: (Link) -> str
+ return self._wheel_cache.get_path_for_link(link)
+
+ def get_ephem_path_for_link(self, link):
+ # type: (Link) -> str
+ return self._ephem_cache.get_path_for_link(link)
+
+ def get(
+ self,
+ link, # type: Link
+ package_name, # type: Optional[str]
+ supported_tags, # type: List[Tag]
+ ):
+ # type: (...) -> Link
+ cache_entry = self.get_cache_entry(link, package_name, supported_tags)
+ if cache_entry is None:
+ return link
+ return cache_entry.link
+
+ def get_cache_entry(
+ self,
+ link, # type: Link
+ package_name, # type: Optional[str]
+ supported_tags, # type: List[Tag]
+ ):
+ # type: (...) -> Optional[CacheEntry]
+ """Returns a CacheEntry with a link to a cached item if it exists or
+ None. The cache entry indicates if the item was found in the persistent
+ or ephemeral cache.
+ """
+ retval = self._wheel_cache.get(
+ link=link,
+ package_name=package_name,
+ supported_tags=supported_tags,
+ )
+ if retval is not link:
+ return CacheEntry(retval, persistent=True)
+
+ retval = self._ephem_cache.get(
+ link=link,
+ package_name=package_name,
+ supported_tags=supported_tags,
+ )
+ if retval is not link:
+ return CacheEntry(retval, persistent=False)
+
+ return None
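The two-tier lookup in `WheelCache.get_cache_entry` above can be sketched like this, with plain dicts standing in for the persistent and ephemeral `SimpleWheelCache` instances (all names are illustrative):

```python
class CacheEntry:
    """Mirrors the CacheEntry in the diff: a link plus its tier."""

    def __init__(self, link, persistent):
        self.link = link
        self.persistent = persistent


def get_cache_entry(link, persistent_cache, ephem_cache):
    """Return a CacheEntry, or None when neither tier has the link.

    Like pip, a miss is detected by identity: a cache returning the
    very object it was given means "not found".
    """
    hit = persistent_cache.get(link, link)
    if hit is not link:
        return CacheEntry(hit, persistent=True)
    hit = ephem_cache.get(link, link)
    if hit is not link:
        return CacheEntry(hit, persistent=False)
    return None


# Found only in the ephemeral tier, so persistent is False.
entry = get_cache_entry("sdist-url", {}, {"sdist-url": "cached-wheel-url"})
```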
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__init__.py b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__init__.py
new file mode 100644
index 0000000..5f656e0
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__init__.py
@@ -0,0 +1,4 @@
+"""Subpackage containing all of pip's command line interface related code
+"""
+
+# This file intentionally does not import submodules
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..6a62adb
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/autocompletion.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/autocompletion.cpython-39.pyc
new file mode 100644
index 0000000..cb170e4
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/autocompletion.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/base_command.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/base_command.cpython-39.pyc
new file mode 100644
index 0000000..eed7504
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/base_command.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/cmdoptions.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/cmdoptions.cpython-39.pyc
new file mode 100644
index 0000000..507a983
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/cmdoptions.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/command_context.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/command_context.cpython-39.pyc
new file mode 100644
index 0000000..201e867
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/command_context.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/main.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/main.cpython-39.pyc
new file mode 100644
index 0000000..98c50ca
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/main.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/main_parser.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/main_parser.cpython-39.pyc
new file mode 100644
index 0000000..7c93b62
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/main_parser.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/parser.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/parser.cpython-39.pyc
new file mode 100644
index 0000000..f686bcb
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/parser.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/progress_bars.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/progress_bars.cpython-39.pyc
new file mode 100644
index 0000000..77f2e04
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/progress_bars.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/req_command.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/req_command.cpython-39.pyc
new file mode 100644
index 0000000..6772303
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/req_command.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/spinners.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/spinners.cpython-39.pyc
new file mode 100644
index 0000000..4fe83b9
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/spinners.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/status_codes.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/status_codes.cpython-39.pyc
new file mode 100644
index 0000000..c0f1dea
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/cli/__pycache__/status_codes.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/autocompletion.py b/.venv/lib/python3.9/site-packages/pip/_internal/cli/autocompletion.py
new file mode 100644
index 0000000..4c51dad
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cli/autocompletion.py
@@ -0,0 +1,164 @@
+"""Logic that powers autocompletion installed by ``pip completion``.
+"""
+
+import optparse
+import os
+import sys
+from itertools import chain
+
+from pip._internal.cli.main_parser import create_main_parser
+from pip._internal.commands import commands_dict, create_command
+from pip._internal.utils.misc import get_installed_distributions
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from typing import Any, Iterable, List, Optional
+
+
+def autocomplete():
+ # type: () -> None
+ """Entry Point for completion of main and subcommand options.
+ """
+ # Don't complete if user hasn't sourced bash_completion file.
+ if 'PIP_AUTO_COMPLETE' not in os.environ:
+ return
+ cwords = os.environ['COMP_WORDS'].split()[1:]
+ cword = int(os.environ['COMP_CWORD'])
+ try:
+ current = cwords[cword - 1]
+ except IndexError:
+ current = ''
+
+ parser = create_main_parser()
+ subcommands = list(commands_dict)
+ options = []
+
+ # subcommand
+ subcommand_name = None # type: Optional[str]
+ for word in cwords:
+ if word in subcommands:
+ subcommand_name = word
+ break
+ # subcommand options
+ if subcommand_name is not None:
+ # special case: 'help' subcommand has no options
+ if subcommand_name == 'help':
+ sys.exit(1)
+ # special case: list locally installed dists for show and uninstall
+ should_list_installed = (
+ subcommand_name in ['show', 'uninstall'] and
+ not current.startswith('-')
+ )
+ if should_list_installed:
+ installed = []
+ lc = current.lower()
+ for dist in get_installed_distributions(local_only=True):
+ if dist.key.startswith(lc) and dist.key not in cwords[1:]:
+ installed.append(dist.key)
+ # if there are no dists installed, fall back to option completion
+ if installed:
+ for dist in installed:
+ print(dist)
+ sys.exit(1)
+
+ subcommand = create_command(subcommand_name)
+
+ for opt in subcommand.parser.option_list_all:
+ if opt.help != optparse.SUPPRESS_HELP:
+ for opt_str in opt._long_opts + opt._short_opts:
+ options.append((opt_str, opt.nargs))
+
+ # filter out previously specified options from available options
+ prev_opts = [x.split('=')[0] for x in cwords[1:cword - 1]]
+ options = [(x, v) for (x, v) in options if x not in prev_opts]
+ # filter options by current input
+ options = [(k, v) for k, v in options if k.startswith(current)]
+ # get completion type given cwords and available subcommand options
+ completion_type = get_path_completion_type(
+ cwords, cword, subcommand.parser.option_list_all,
+ )
+ # get completion files and directories if ``completion_type`` is
+ # ``<file>``, ``<dir>`` or ``<path>``
+ if completion_type:
+ paths = auto_complete_paths(current, completion_type)
+ options = [(path, 0) for path in paths]
+ for option in options:
+ opt_label = option[0]
+ # append '=' to options which require args
+ if option[1] and option[0][:2] == "--":
+ opt_label += '='
+ print(opt_label)
+ else:
+ # show main parser options only when necessary
+
+ opts = [i.option_list for i in parser.option_groups]
+ opts.append(parser.option_list)
+ flattened_opts = chain.from_iterable(opts)
+ if current.startswith('-'):
+ for opt in flattened_opts:
+ if opt.help != optparse.SUPPRESS_HELP:
+ subcommands += opt._long_opts + opt._short_opts
+ else:
+ # get completion type given cwords and all available options
+ completion_type = get_path_completion_type(cwords, cword,
+ flattened_opts)
+ if completion_type:
+ subcommands = list(auto_complete_paths(current,
+ completion_type))
+
+ print(' '.join([x for x in subcommands if x.startswith(current)]))
+ sys.exit(1)
+
+
+def get_path_completion_type(cwords, cword, opts):
+ # type: (List[str], int, Iterable[Any]) -> Optional[str]
+ """Get the type of path completion (``file``, ``dir``, ``path`` or None)
+
+ :param cwords: same as the environment variable ``COMP_WORDS``
+ :param cword: same as the environment variable ``COMP_CWORD``
+ :param opts: The available options to check
+ :return: path completion type (``file``, ``dir``, ``path`` or None)
+ """
+ if cword < 2 or not cwords[cword - 2].startswith('-'):
+ return None
+ for opt in opts:
+ if opt.help == optparse.SUPPRESS_HELP:
+ continue
+ for o in str(opt).split('/'):
+ if cwords[cword - 2].split('=')[0] == o:
+ if not opt.metavar or any(
+ x in ('path', 'file', 'dir')
+ for x in opt.metavar.split('/')):
+ return opt.metavar
+ return None
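A simplified, self-contained model of `get_path_completion_type` above: path completion only applies when the word before the cursor is an option whose metavar names a path-like kind. The `FakeOpt` class and the option set below are invented for the example (real callers pass `optparse.Option` objects, and the real function also handles a missing metavar):

```python
class FakeOpt:
    """Minimal stand-in for an optparse Option."""

    def __init__(self, opt_strs, metavar):
        self.opt_strs = opt_strs
        self.metavar = metavar

    def __str__(self):
        # optparse renders an option's aliases joined by "/".
        return "/".join(self.opt_strs)


def path_completion_type(cwords, cword, opts):
    # No option word before the cursor: nothing path-like to complete.
    if cword < 2 or not cwords[cword - 2].startswith("-"):
        return None
    for opt in opts:
        for o in str(opt).split("/"):
            if cwords[cword - 2].split("=")[0] == o:
                if any(x in ("path", "file", "dir")
                       for x in opt.metavar.split("/")):
                    return opt.metavar
    return None


opts = [FakeOpt(["--log", "--log-file"], "path"), FakeOpt(["--retries"], "n")]
# cwords/cword follow bash's COMP_WORDS/COMP_CWORD conventions.
kind = path_completion_type(["install", "--log", "my"], cword=3, opts=opts)
```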
+
+
+def auto_complete_paths(current, completion_type):
+ # type: (str, str) -> Iterable[str]
+ """If ``completion_type`` is ``file`` or ``path``, list all regular files
+ and directories starting with ``current``; otherwise only list directories
+ starting with ``current``.
+
+ :param current: The word to be completed
+ :param completion_type: path completion type (``file``, ``path`` or ``dir``)
+ :return: A generator of regular files and/or directories
+ """
+ directory, filename = os.path.split(current)
+ current_path = os.path.abspath(directory)
+ # Don't complete paths if they can't be accessed
+ if not os.access(current_path, os.R_OK):
+ return
+ filename = os.path.normcase(filename)
+ # list all files that start with ``filename``
+ file_list = (x for x in os.listdir(current_path)
+ if os.path.normcase(x).startswith(filename))
+ for f in file_list:
+ opt = os.path.join(current_path, f)
+ comp_file = os.path.normcase(os.path.join(directory, f))
+ # complete regular files when there is not ``<dir>`` after option
+ # complete directories when there is ``<file>``, ``<path>`` or
+ # ``<dir>`` after option
+ if completion_type != 'dir' and os.path.isfile(opt):
+ yield comp_file
+ elif os.path.isdir(opt):
+ yield os.path.join(comp_file, '')
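A runnable demonstration of the behaviour `auto_complete_paths` implements above: list entries under a prefix, appending a path separator to directories so the shell can keep descending. This is a simplified sketch (it omits the access check and `normcase` handling of the original):

```python
import os
import tempfile


def complete_paths(current, completion_type):
    """Yield files and/or directories whose names start with `current`."""
    directory, filename = os.path.split(current)
    base = os.path.abspath(directory or ".")
    for entry in sorted(os.listdir(base)):
        if not entry.startswith(filename):
            continue
        full = os.path.join(base, entry)
        shown = os.path.join(directory, entry)
        if completion_type != "dir" and os.path.isfile(full):
            yield shown
        elif os.path.isdir(full):
            # Trailing separator signals "keep completing inside me".
            yield shown + os.sep


with tempfile.TemporaryDirectory() as tmp:
    open(os.path.join(tmp, "req.txt"), "w").close()
    os.mkdir(os.path.join(tmp, "reqs"))
    results = list(complete_paths(os.path.join(tmp, "re"), "path"))
```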
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/base_command.py b/.venv/lib/python3.9/site-packages/pip/_internal/cli/base_command.py
new file mode 100644
index 0000000..d6645fc
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cli/base_command.py
@@ -0,0 +1,226 @@
+"""Base Command class, and related routines"""
+
+import logging
+import logging.config
+import optparse
+import os
+import sys
+import traceback
+
+from pip._internal.cli import cmdoptions
+from pip._internal.cli.command_context import CommandContextMixIn
+from pip._internal.cli.parser import ConfigOptionParser, UpdatingDefaultsHelpFormatter
+from pip._internal.cli.status_codes import (
+ ERROR,
+ PREVIOUS_BUILD_DIR_ERROR,
+ UNKNOWN_ERROR,
+ VIRTUALENV_NOT_FOUND,
+)
+from pip._internal.exceptions import (
+ BadCommand,
+ CommandError,
+ InstallationError,
+ NetworkConnectionError,
+ PreviousBuildDirError,
+ UninstallationError,
+)
+from pip._internal.utils.deprecation import deprecated
+from pip._internal.utils.filesystem import check_path_owner
+from pip._internal.utils.logging import BrokenStdoutLoggingError, setup_logging
+from pip._internal.utils.misc import get_prog, normalize_path
+from pip._internal.utils.temp_dir import global_tempdir_manager, tempdir_registry
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+from pip._internal.utils.virtualenv import running_under_virtualenv
+
+if MYPY_CHECK_RUNNING:
+ from optparse import Values
+ from typing import Any, List, Optional, Tuple
+
+ from pip._internal.utils.temp_dir import (
+ TempDirectoryTypeRegistry as TempDirRegistry,
+ )
+
+__all__ = ['Command']
+
+logger = logging.getLogger(__name__)
+
+
+class Command(CommandContextMixIn):
+ usage = None # type: str
+ ignore_require_venv = False # type: bool
+
+ def __init__(self, name, summary, isolated=False):
+ # type: (str, str, bool) -> None
+ super().__init__()
+ parser_kw = {
+ 'usage': self.usage,
+ 'prog': f'{get_prog()} {name}',
+ 'formatter': UpdatingDefaultsHelpFormatter(),
+ 'add_help_option': False,
+ 'name': name,
+ 'description': self.__doc__,
+ 'isolated': isolated,
+ }
+
+ self.name = name
+ self.summary = summary
+ self.parser = ConfigOptionParser(**parser_kw)
+
+ self.tempdir_registry = None # type: Optional[TempDirRegistry]
+
+ # Commands should add options to this option group
+ optgroup_name = f'{self.name.capitalize()} Options'
+ self.cmd_opts = optparse.OptionGroup(self.parser, optgroup_name)
+
+ # Add the general options
+ gen_opts = cmdoptions.make_option_group(
+ cmdoptions.general_group,
+ self.parser,
+ )
+ self.parser.add_option_group(gen_opts)
+
+ self.add_options()
+
+ def add_options(self):
+ # type: () -> None
+ pass
+
+ def handle_pip_version_check(self, options):
+ # type: (Values) -> None
+ """
+ This is a no-op so that commands by default do not do the pip version
+ check.
+ """
+ # Make sure we do the pip version check if the index_group options
+ # are present.
+ assert not hasattr(options, 'no_index')
+
+ def run(self, options, args):
+ # type: (Values, List[Any]) -> int
+ raise NotImplementedError
+
+ def parse_args(self, args):
+ # type: (List[str]) -> Tuple[Any, Any]
+ # factored out for testability
+ return self.parser.parse_args(args)
+
+ def main(self, args):
+ # type: (List[str]) -> int
+ try:
+ with self.main_context():
+ return self._main(args)
+ finally:
+ logging.shutdown()
+
+ def _main(self, args):
+ # type: (List[str]) -> int
+ # We must initialize this before the tempdir manager, otherwise the
+ # configuration would not be accessible by the time we clean up the
+ # tempdir manager.
+ self.tempdir_registry = self.enter_context(tempdir_registry())
+ # Intentionally set as early as possible so globally-managed temporary
+ # directories are available to the rest of the code.
+ self.enter_context(global_tempdir_manager())
+
+ options, args = self.parse_args(args)
+
+ # Set verbosity so that it can be used elsewhere.
+ self.verbosity = options.verbose - options.quiet
+
+ level_number = setup_logging(
+ verbosity=self.verbosity,
+ no_color=options.no_color,
+ user_log_file=options.log,
+ )
+
+ # TODO: Try to get these passing down from the command?
+ # without resorting to os.environ to hold these.
+ # This also affects isolated builds and it should.
+
+ if options.no_input:
+ os.environ['PIP_NO_INPUT'] = '1'
+
+ if options.exists_action:
+ os.environ['PIP_EXISTS_ACTION'] = ' '.join(options.exists_action)
+
+ if options.require_venv and not self.ignore_require_venv:
+ # If a venv is required check if it can really be found
+ if not running_under_virtualenv():
+ logger.critical(
+ 'Could not find an activated virtualenv (required).'
+ )
+ sys.exit(VIRTUALENV_NOT_FOUND)
+
+ if options.cache_dir:
+ options.cache_dir = normalize_path(options.cache_dir)
+ if not check_path_owner(options.cache_dir):
+ logger.warning(
+ "The directory '%s' or its parent directory is not owned "
+ "or is not writable by the current user. The cache "
+ "has been disabled. Check the permissions and owner of "
+ "that directory. If executing pip with sudo, you may want "
+ "sudo's -H flag.",
+ options.cache_dir,
+ )
+ options.cache_dir = None
+
+ if getattr(options, "build_dir", None):
+ deprecated(
+ reason=(
+ "The -b/--build/--build-dir/--build-directory "
+ "option is deprecated and has no effect anymore."
+ ),
+ replacement=(
+ "use the TMPDIR/TEMP/TMP environment variable, "
+ "possibly combined with --no-clean"
+ ),
+ gone_in="21.1",
+ issue=8333,
+ )
+
+ if '2020-resolver' in options.features_enabled:
+ logger.warning(
+ "--use-feature=2020-resolver no longer has any effect, "
+ "since it is now the default dependency resolver in pip. "
+ "This will become an error in pip 21.0."
+ )
+
+ try:
+ status = self.run(options, args)
+ assert isinstance(status, int)
+ return status
+ except PreviousBuildDirError as exc:
+ logger.critical(str(exc))
+ logger.debug('Exception information:', exc_info=True)
+
+ return PREVIOUS_BUILD_DIR_ERROR
+ except (InstallationError, UninstallationError, BadCommand,
+ NetworkConnectionError) as exc:
+ logger.critical(str(exc))
+ logger.debug('Exception information:', exc_info=True)
+
+ return ERROR
+ except CommandError as exc:
+ logger.critical('%s', exc)
+ logger.debug('Exception information:', exc_info=True)
+
+ return ERROR
+ except BrokenStdoutLoggingError:
+ # Bypass our logger and write any remaining messages to stderr
+ # because stdout no longer works.
+ print('ERROR: Pipe to stdout was broken', file=sys.stderr)
+ if level_number <= logging.DEBUG:
+ traceback.print_exc(file=sys.stderr)
+
+ return ERROR
+ except KeyboardInterrupt:
+ logger.critical('Operation cancelled by user')
+ logger.debug('Exception information:', exc_info=True)
+
+ return ERROR
+ except BaseException:
+ logger.critical('Exception:', exc_info=True)
+
+ return UNKNOWN_ERROR
+ finally:
+ self.handle_pip_version_check(options)
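The exception handling at the end of `Command._main` above boils down to a mapping from failure class to exit status. A hedged sketch of that shape (the numeric values and exception classes below are illustrative stand-ins, not pip's actual `status_codes` constants):

```python
# Illustrative status codes; pip defines its own in cli/status_codes.py.
SUCCESS, ERROR, UNKNOWN_ERROR, PREVIOUS_BUILD_DIR_ERROR = 0, 1, 2, 3


class PreviousBuildDirError(Exception):
    pass


class CommandError(Exception):
    pass


def run_with_status(fn):
    """Run fn() and translate exceptions into pip-style exit codes."""
    try:
        status = fn()
        assert isinstance(status, int)
        return status
    except PreviousBuildDirError:
        return PREVIOUS_BUILD_DIR_ERROR
    except (CommandError, KeyboardInterrupt):
        return ERROR
    except BaseException:
        # Anything unanticipated collapses to a generic failure code.
        return UNKNOWN_ERROR


def ok():
    return SUCCESS


def prev():
    raise PreviousBuildDirError()


def cmd():
    raise CommandError()


def odd():
    raise RuntimeError()


statuses = [run_with_status(f) for f in (ok, prev, cmd, odd)]
```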
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/cmdoptions.py b/.venv/lib/python3.9/site-packages/pip/_internal/cli/cmdoptions.py
new file mode 100644
index 0000000..16fe14b
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cli/cmdoptions.py
@@ -0,0 +1,969 @@
+"""
+shared options and groups
+
+The principle here is to define options once, but *not* instantiate them
+globally. One reason being that options with action='append' can carry state
+between parses. pip parses general options twice internally, and shouldn't
+pass on state. To be consistent, all options will follow this design.
+"""
+
+# The following comment should be removed at some point in the future.
+# mypy: strict-optional=False
+
+import os
+import textwrap
+import warnings
+from functools import partial
+from optparse import SUPPRESS_HELP, Option, OptionGroup
+from textwrap import dedent
+
+from pip._vendor.packaging.utils import canonicalize_name
+
+from pip._internal.cli.progress_bars import BAR_TYPES
+from pip._internal.exceptions import CommandError
+from pip._internal.locations import USER_CACHE_DIR, get_src_prefix
+from pip._internal.models.format_control import FormatControl
+from pip._internal.models.index import PyPI
+from pip._internal.models.target_python import TargetPython
+from pip._internal.utils.hashes import STRONG_HASHES
+from pip._internal.utils.misc import strtobool
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from optparse import OptionParser, Values
+ from typing import Any, Callable, Dict, Optional, Tuple
+
+ from pip._internal.cli.parser import ConfigOptionParser
+
+
+def raise_option_error(parser, option, msg):
+ # type: (OptionParser, Option, str) -> None
+ """
+ Raise an option parsing error using parser.error().
+
+ Args:
+ parser: an OptionParser instance.
+ option: an Option instance.
+ msg: the error text.
+ """
+ msg = f'{option} error: {msg}'
+ msg = textwrap.fill(' '.join(msg.split()))
+ parser.error(msg)
+
+
+def make_option_group(group, parser):
+ # type: (Dict[str, Any], ConfigOptionParser) -> OptionGroup
+ """
+ Return an OptionGroup object
+ group -- assumed to be dict with 'name' and 'options' keys
+ parser -- an optparse Parser
+ """
+ option_group = OptionGroup(parser, group['name'])
+ for option in group['options']:
+ option_group.add_option(option())
+ return option_group
+
+
+def check_install_build_global(options, check_options=None):
+ # type: (Values, Optional[Values]) -> None
+ """Disable wheels if per-setup.py call options are set.
+
+ :param options: The OptionParser options to update.
+ :param check_options: The options to check, if not supplied defaults to
+ options.
+ """
+ if check_options is None:
+ check_options = options
+
+ def getname(n):
+ # type: (str) -> Optional[Any]
+ return getattr(check_options, n, None)
+ names = ["build_options", "global_options", "install_options"]
+ if any(map(getname, names)):
+ control = options.format_control
+ control.disallow_binaries()
+ warnings.warn(
+ 'Disabling all use of wheels due to the use of --build-option '
+ '/ --global-option / --install-option.', stacklevel=2,
+ )
+
+
+def check_dist_restriction(options, check_target=False):
+ # type: (Values, bool) -> None
+ """Function for determining if custom platform options are allowed.
+
+ :param options: The OptionParser options.
+ :param check_target: Whether or not to check if --target is being used.
+ """
+ dist_restriction_set = any([
+ options.python_version,
+ options.platforms,
+ options.abis,
+ options.implementation,
+ ])
+
+ binary_only = FormatControl(set(), {':all:'})
+ sdist_dependencies_allowed = (
+ options.format_control != binary_only and
+ not options.ignore_dependencies
+ )
+
+ # Installations or downloads using dist restrictions must not combine
+ # source distributions and dist-specific wheels, as they are not
+ # guaranteed to be locally compatible.
+ if dist_restriction_set and sdist_dependencies_allowed:
+ raise CommandError(
+ "When restricting platform and interpreter constraints using "
+ "--python-version, --platform, --abi, or --implementation, "
+ "either --no-deps must be set, or --only-binary=:all: must be "
+ "set and --no-binary must not be set (or must be set to "
+ ":none:)."
+ )
+
+ if check_target:
+ if dist_restriction_set and not options.target_dir:
+ raise CommandError(
+ "Can not use any platform or abi specific options unless "
+ "installing via '--target'"
+ )
+
+
+def _path_option_check(option, opt, value):
+ # type: (Option, str, str) -> str
+ return os.path.expanduser(value)
+
+
+def _package_name_option_check(option, opt, value):
+ # type: (Option, str, str) -> str
+ return canonicalize_name(value)
+
+
+class PipOption(Option):
+ TYPES = Option.TYPES + ("path", "package_name")
+ TYPE_CHECKER = Option.TYPE_CHECKER.copy()
+ TYPE_CHECKER["package_name"] = _package_name_option_check
+ TYPE_CHECKER["path"] = _path_option_check
+
+
+###########
+# options #
+###########
+
+help_ = partial(
+ Option,
+ '-h', '--help',
+ dest='help',
+ action='help',
+ help='Show help.',
+) # type: Callable[..., Option]
+
+isolated_mode = partial(
+ Option,
+ "--isolated",
+ dest="isolated_mode",
+ action="store_true",
+ default=False,
+ help=(
+ "Run pip in an isolated mode, ignoring environment variables and user "
+ "configuration."
+ ),
+) # type: Callable[..., Option]
+
+require_virtualenv = partial(
+ Option,
+ # Run only if inside a virtualenv, bail if not.
+ '--require-virtualenv', '--require-venv',
+ dest='require_venv',
+ action='store_true',
+ default=False,
+ help=SUPPRESS_HELP
+) # type: Callable[..., Option]
+
+verbose = partial(
+ Option,
+ '-v', '--verbose',
+ dest='verbose',
+ action='count',
+ default=0,
+ help='Give more output. Option is additive, and can be used up to 3 times.'
+) # type: Callable[..., Option]
+
+no_color = partial(
+ Option,
+ '--no-color',
+ dest='no_color',
+ action='store_true',
+ default=False,
+ help="Suppress colored output.",
+) # type: Callable[..., Option]
+
+version = partial(
+ Option,
+ '-V', '--version',
+ dest='version',
+ action='store_true',
+ help='Show version and exit.',
+) # type: Callable[..., Option]
+
+quiet = partial(
+ Option,
+ '-q', '--quiet',
+ dest='quiet',
+ action='count',
+ default=0,
+ help=(
+ 'Give less output. Option is additive, and can be used up to 3'
+ ' times (corresponding to WARNING, ERROR, and CRITICAL logging'
+ ' levels).'
+ ),
+) # type: Callable[..., Option]
+
+progress_bar = partial(
+ Option,
+ '--progress-bar',
+ dest='progress_bar',
+ type='choice',
+ choices=list(BAR_TYPES.keys()),
+ default='on',
+ help=(
+ 'Specify type of progress to be displayed [' +
+ '|'.join(BAR_TYPES.keys()) + '] (default: %default)'
+ ),
+) # type: Callable[..., Option]
+
+log = partial(
+ PipOption,
+ "--log", "--log-file", "--local-log",
+ dest="log",
+ metavar="path",
+ type="path",
+ help="Path to a verbose appending log."
+) # type: Callable[..., Option]
+
+no_input = partial(
+ Option,
+ # Don't ask for input
+ '--no-input',
+ dest='no_input',
+ action='store_true',
+ default=False,
+ help="Disable prompting for input."
+) # type: Callable[..., Option]
+
+proxy = partial(
+ Option,
+ '--proxy',
+ dest='proxy',
+ type='str',
+ default='',
+ help="Specify a proxy in the form [user:passwd@]proxy.server:port."
+) # type: Callable[..., Option]
+
+retries = partial(
+ Option,
+ '--retries',
+ dest='retries',
+ type='int',
+ default=5,
+ help="Maximum number of retries each connection should attempt "
+ "(default %default times).",
+) # type: Callable[..., Option]
+
+timeout = partial(
+ Option,
+ '--timeout', '--default-timeout',
+ metavar='sec',
+ dest='timeout',
+ type='float',
+ default=15,
+ help='Set the socket timeout (default %default seconds).',
+) # type: Callable[..., Option]
+
+
+def exists_action():
+ # type: () -> Option
+ return Option(
+ # Option when path already exist
+ '--exists-action',
+ dest='exists_action',
+ type='choice',
+ choices=['s', 'i', 'w', 'b', 'a'],
+ default=[],
+ action='append',
+ metavar='action',
+ help="Default action when a path already exists: "
+ "(s)witch, (i)gnore, (w)ipe, (b)ackup, (a)bort.",
+ )
+
+
+cert = partial(
+ PipOption,
+ '--cert',
+ dest='cert',
+ type='path',
+ metavar='path',
+ help="Path to alternate CA bundle.",
+) # type: Callable[..., Option]
+
+client_cert = partial(
+ PipOption,
+ '--client-cert',
+ dest='client_cert',
+ type='path',
+ default=None,
+ metavar='path',
+ help="Path to SSL client certificate, a single file containing the "
+ "private key and the certificate in PEM format.",
+) # type: Callable[..., Option]
+
+index_url = partial(
+ Option,
+ '-i', '--index-url', '--pypi-url',
+ dest='index_url',
+ metavar='URL',
+ default=PyPI.simple_url,
+ help="Base URL of the Python Package Index (default %default). "
+ "This should point to a repository compliant with PEP 503 "
+ "(the simple repository API) or a local directory laid out "
+ "in the same format.",
+) # type: Callable[..., Option]
+
+
+def extra_index_url():
+ # type: () -> Option
+ return Option(
+ '--extra-index-url',
+ dest='extra_index_urls',
+ metavar='URL',
+ action='append',
+ default=[],
+ help="Extra URLs of package indexes to use in addition to "
+ "--index-url. Should follow the same rules as "
+ "--index-url.",
+ )
+
+
+no_index = partial(
+ Option,
+ '--no-index',
+ dest='no_index',
+ action='store_true',
+ default=False,
+ help='Ignore package index (only looking at --find-links URLs instead).',
+) # type: Callable[..., Option]
+
+
+def find_links():
+ # type: () -> Option
+ return Option(
+ '-f', '--find-links',
+ dest='find_links',
+ action='append',
+ default=[],
+ metavar='url',
+ help="If a URL or path to an html file, then parse for links to "
+ "archives such as sdist (.tar.gz) or wheel (.whl) files. "
+ "If a local path or file:// URL that's a directory, "
+ "then look for archives in the directory listing. "
+ "Links to VCS project URLs are not supported.",
+ )
+
+
+def trusted_host():
+ # type: () -> Option
+ return Option(
+ "--trusted-host",
+ dest="trusted_hosts",
+ action="append",
+ metavar="HOSTNAME",
+ default=[],
+ help="Mark this host or host:port pair as trusted, even though it "
+ "does not have valid (or any) HTTPS.",
+ )
+
+
+def constraints():
+ # type: () -> Option
+ return Option(
+ '-c', '--constraint',
+ dest='constraints',
+ action='append',
+ default=[],
+ metavar='file',
+ help='Constrain versions using the given constraints file. '
+ 'This option can be used multiple times.'
+ )
+
+
+def requirements():
+ # type: () -> Option
+ return Option(
+ '-r', '--requirement',
+ dest='requirements',
+ action='append',
+ default=[],
+ metavar='file',
+ help='Install from the given requirements file. '
+ 'This option can be used multiple times.'
+ )
+
+
+def editable():
+ # type: () -> Option
+ return Option(
+ '-e', '--editable',
+ dest='editables',
+ action='append',
+ default=[],
+ metavar='path/url',
+ help=('Install a project in editable mode (i.e. setuptools '
+ '"develop mode") from a local project path or a VCS url.'),
+ )
+
+
+def _handle_src(option, opt_str, value, parser):
+ # type: (Option, str, str, OptionParser) -> None
+ value = os.path.abspath(value)
+ setattr(parser.values, option.dest, value)
+
+
+src = partial(
+ PipOption,
+ '--src', '--source', '--source-dir', '--source-directory',
+ dest='src_dir',
+ type='path',
+ metavar='dir',
+ default=get_src_prefix(),
+ action='callback',
+ callback=_handle_src,
+ help='Directory to check out editable projects into. '
+ 'The default in a virtualenv is "<venv path>/src". '
+ 'The default for global installs is "<current dir>/src".'
+) # type: Callable[..., Option]
+
+
+def _get_format_control(values, option):
+ # type: (Values, Option) -> Any
+ """Get a format_control object."""
+ return getattr(values, option.dest)
+
+
+def _handle_no_binary(option, opt_str, value, parser):
+ # type: (Option, str, str, OptionParser) -> None
+ existing = _get_format_control(parser.values, option)
+ FormatControl.handle_mutual_excludes(
+ value, existing.no_binary, existing.only_binary,
+ )
+
+
+def _handle_only_binary(option, opt_str, value, parser):
+ # type: (Option, str, str, OptionParser) -> None
+ existing = _get_format_control(parser.values, option)
+ FormatControl.handle_mutual_excludes(
+ value, existing.only_binary, existing.no_binary,
+ )
+
+
+def no_binary():
+ # type: () -> Option
+ format_control = FormatControl(set(), set())
+ return Option(
+ "--no-binary", dest="format_control", action="callback",
+ callback=_handle_no_binary, type="str",
+ default=format_control,
+ help='Do not use binary packages. Can be supplied multiple times, and '
+ 'each time adds to the existing value. Accepts either ":all:" to '
+ 'disable all binary packages, ":none:" to empty the set (notice '
+ 'the colons), or one or more package names with commas between '
+ 'them (no colons). Note that some packages are tricky to compile '
+ 'and may fail to install when this option is used on them.',
+ )
+
+
+def only_binary():
+ # type: () -> Option
+ format_control = FormatControl(set(), set())
+ return Option(
+ "--only-binary", dest="format_control", action="callback",
+ callback=_handle_only_binary, type="str",
+ default=format_control,
+ help='Do not use source packages. Can be supplied multiple times, and '
+ 'each time adds to the existing value. Accepts either ":all:" to '
+ 'disable all source packages, ":none:" to empty the set, or one '
+ 'or more package names with commas between them. Packages '
+ 'without binary distributions will fail to install when this '
+ 'option is used on them.',
+ )
+
+
+platforms = partial(
+ Option,
+ '--platform',
+ dest='platforms',
+ metavar='platform',
+ action='append',
+ default=None,
+ help=("Only use wheels compatible with <platform>. Defaults to the "
+ "platform of the running system. Use this option multiple times to "
+ "specify multiple platforms supported by the target interpreter."),
+) # type: Callable[..., Option]
+
+
+# This was made a separate function for unit-testing purposes.
+def _convert_python_version(value):
+ # type: (str) -> Tuple[Tuple[int, ...], Optional[str]]
+ """
+ Convert a version string like "3", "37", or "3.7.3" into a tuple of ints.
+
+ :return: A 2-tuple (version_info, error_msg), where `error_msg` is
+ non-None if and only if there was a parsing error.
+ """
+ if not value:
+ # The empty string is the same as not providing a value.
+ return (None, None)
+
+ parts = value.split('.')
+ if len(parts) > 3:
+ return ((), 'at most three version parts are allowed')
+
+ if len(parts) == 1:
+ # Then we are in the case of "3" or "37".
+ value = parts[0]
+ if len(value) > 1:
+ parts = [value[0], value[1:]]
+
+ try:
+ version_info = tuple(int(part) for part in parts)
+ except ValueError:
+ return ((), 'each version part must be an integer')
+
+ return (version_info, None)
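As a standalone illustration, a sketch mirroring the parsing rules above (not pip's own function) behaves like this:

```python
def convert_python_version(value):
    """Sketch of the --python-version parsing rules shown above."""
    if not value:
        return (None, None)  # empty string == value not provided
    parts = value.split('.')
    if len(parts) > 3:
        return ((), 'at most three version parts are allowed')
    if len(parts) == 1 and len(parts[0]) > 1:
        # A bare "37" is shorthand for "3.7".
        parts = [parts[0][0], parts[0][1:]]
    try:
        return (tuple(int(part) for part in parts), None)
    except ValueError:
        return ((), 'each version part must be an integer')
```

So `"3"`, `"37"`, and `"3.7.3"` parse to `(3,)`, `(3, 7)`, and `(3, 7, 3)` respectively, while malformed input returns an error message rather than raising.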
+
+
+def _handle_python_version(option, opt_str, value, parser):
+ # type: (Option, str, str, OptionParser) -> None
+ """
+ Handle a provided --python-version value.
+ """
+ version_info, error_msg = _convert_python_version(value)
+ if error_msg is not None:
+ msg = (
+ 'invalid --python-version value: {!r}: {}'.format(
+ value, error_msg,
+ )
+ )
+ raise_option_error(parser, option=option, msg=msg)
+
+ parser.values.python_version = version_info
+
+
+python_version = partial(
+ Option,
+ '--python-version',
+ dest='python_version',
+ metavar='python_version',
+ action='callback',
+ callback=_handle_python_version, type='str',
+ default=None,
+ help=dedent("""\
+ The Python interpreter version to use for wheel and "Requires-Python"
+ compatibility checks. Defaults to a version derived from the running
+ interpreter. The version can be specified using up to three dot-separated
+ integers (e.g. "3" for 3.0.0, "3.7" for 3.7.0, or "3.7.3"). A major-minor
+ version can also be given as a string without dots (e.g. "37" for 3.7.0).
+ """),
+) # type: Callable[..., Option]
+
+
+implementation = partial(
+ Option,
+ '--implementation',
+ dest='implementation',
+ metavar='implementation',
+ default=None,
+ help=("Only use wheels compatible with Python "
+ "implementation <implementation>, e.g. 'pp', 'jy', 'cp', "
+ "or 'ip'. If not specified, then the current "
+ "interpreter implementation is used. Use 'py' to force "
+ "implementation-agnostic wheels."),
+) # type: Callable[..., Option]
+
+
+abis = partial(
+ Option,
+ '--abi',
+ dest='abis',
+ metavar='abi',
+ action='append',
+ default=None,
+ help=("Only use wheels compatible with Python abi <abi>, e.g. 'pypy_41'. "
+ "If not specified, then the current interpreter abi tag is used. "
+ "Use this option multiple times to specify multiple abis supported "
+ "by the target interpreter. Generally you will need to specify "
+ "--implementation, --platform, and --python-version when using this "
+ "option."),
+) # type: Callable[..., Option]
+
+
+def add_target_python_options(cmd_opts):
+ # type: (OptionGroup) -> None
+ cmd_opts.add_option(platforms())
+ cmd_opts.add_option(python_version())
+ cmd_opts.add_option(implementation())
+ cmd_opts.add_option(abis())
+
+
+def make_target_python(options):
+ # type: (Values) -> TargetPython
+ target_python = TargetPython(
+ platforms=options.platforms,
+ py_version_info=options.python_version,
+ abis=options.abis,
+ implementation=options.implementation,
+ )
+
+ return target_python
+
+
+def prefer_binary():
+ # type: () -> Option
+ return Option(
+ "--prefer-binary",
+ dest="prefer_binary",
+ action="store_true",
+ default=False,
+ help="Prefer older binary packages over newer source packages."
+ )
+
+
+cache_dir = partial(
+ PipOption,
+ "--cache-dir",
+ dest="cache_dir",
+ default=USER_CACHE_DIR,
+ metavar="dir",
+ type='path',
+ help="Store the cache data in <dir>."
+) # type: Callable[..., Option]
+
+
+def _handle_no_cache_dir(option, opt, value, parser):
+ # type: (Option, str, str, OptionParser) -> None
+ """
+ Process a value provided for the --no-cache-dir option.
+
+ This is an optparse.Option callback for the --no-cache-dir option.
+ """
+ # The value argument will be None if --no-cache-dir is passed via the
+ # command-line, since the option doesn't accept arguments. However,
+ # the value can be non-None if the option is triggered e.g. by an
+ # environment variable, like PIP_NO_CACHE_DIR=true.
+ if value is not None:
+ # Then parse the string value to get argument error-checking.
+ try:
+ strtobool(value)
+ except ValueError as exc:
+ raise_option_error(parser, option=option, msg=str(exc))
+
+ # Originally, setting PIP_NO_CACHE_DIR to a value that strtobool()
+ # converted to 0 (like "false" or "no") caused cache_dir to be disabled
+ # rather than enabled (logic would say the latter). Thus, we disable
+ # the cache directory not just on values that parse to True, but (for
+ # backwards compatibility reasons) also on values that parse to False.
+ # In other words, always set it to False if the option is provided in
+ # some (valid) form.
+ parser.values.cache_dir = False
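In other words, under these backwards-compatibility rules any valid boolean value disables the cache; only an unparseable value is an error. A minimal sketch, with a stand-in for pip's `strtobool` (distutils-style truth values):

```python
def strtobool(val):
    # Stand-in for pip's strtobool (distutils-style truth values).
    val = val.lower()
    if val in ('y', 'yes', 't', 'true', 'on', '1'):
        return 1
    if val in ('n', 'no', 'f', 'false', 'off', '0'):
        return 0
    raise ValueError('invalid truth value {!r}'.format(val))

def cache_dir_after_no_cache_dir(value):
    """Sketch: the resulting cache_dir once --no-cache-dir is triggered."""
    if value is not None:
        strtobool(value)  # validation only; the parsed result is ignored
    return False  # always disabled, even for "false"/"no"
```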
+
+
+no_cache = partial(
+ Option,
+ "--no-cache-dir",
+ dest="cache_dir",
+ action="callback",
+ callback=_handle_no_cache_dir,
+ help="Disable the cache.",
+) # type: Callable[..., Option]
+
+no_deps = partial(
+ Option,
+ '--no-deps', '--no-dependencies',
+ dest='ignore_dependencies',
+ action='store_true',
+ default=False,
+ help="Don't install package dependencies.",
+) # type: Callable[..., Option]
+
+build_dir = partial(
+ PipOption,
+ '-b', '--build', '--build-dir', '--build-directory',
+ dest='build_dir',
+ type='path',
+ metavar='dir',
+ help=SUPPRESS_HELP,
+) # type: Callable[..., Option]
+
+ignore_requires_python = partial(
+ Option,
+ '--ignore-requires-python',
+ dest='ignore_requires_python',
+ action='store_true',
+ help='Ignore the Requires-Python information.'
+) # type: Callable[..., Option]
+
+no_build_isolation = partial(
+ Option,
+ '--no-build-isolation',
+ dest='build_isolation',
+ action='store_false',
+ default=True,
+ help='Disable isolation when building a modern source distribution. '
+ 'Build dependencies specified by PEP 518 must be already installed '
+ 'if this option is used.'
+) # type: Callable[..., Option]
+
+
+def _handle_no_use_pep517(option, opt, value, parser):
+ # type: (Option, str, str, OptionParser) -> None
+ """
+ Process a value provided for the --no-use-pep517 option.
+
+ This is an optparse.Option callback for the no_use_pep517 option.
+ """
+ # Since --no-use-pep517 doesn't accept arguments, the value argument
+ # will be None if --no-use-pep517 is passed via the command-line.
+ # However, the value can be non-None if the option is triggered e.g.
+ # by an environment variable, for example "PIP_NO_USE_PEP517=true".
+ if value is not None:
+ msg = """A value was passed for --no-use-pep517,
+ probably using either the PIP_NO_USE_PEP517 environment variable
+ or the "no-use-pep517" config file option. Use an appropriate value
+ of the PIP_USE_PEP517 environment variable or the "use-pep517"
+ config file option instead.
+ """
+ raise_option_error(parser, option=option, msg=msg)
+
+ # Otherwise, --no-use-pep517 was passed via the command-line.
+ parser.values.use_pep517 = False
+
+
+use_pep517 = partial(
+ Option,
+ '--use-pep517',
+ dest='use_pep517',
+ action='store_true',
+ default=None,
+ help='Use PEP 517 for building source distributions '
+ '(use --no-use-pep517 to force legacy behaviour).'
+) # type: Any
+
+no_use_pep517 = partial(
+ Option,
+ '--no-use-pep517',
+ dest='use_pep517',
+ action='callback',
+ callback=_handle_no_use_pep517,
+ default=None,
+ help=SUPPRESS_HELP
+) # type: Any
+
+install_options = partial(
+ Option,
+ '--install-option',
+ dest='install_options',
+ action='append',
+ metavar='options',
+ help="Extra arguments to be supplied to the setup.py install "
+ "command (use like --install-option=\"--install-scripts=/usr/local/"
+ "bin\"). Use multiple --install-option options to pass multiple "
+ "options to setup.py install. If you are using an option with a "
+ "directory path, be sure to use an absolute path.",
+) # type: Callable[..., Option]
+
+global_options = partial(
+ Option,
+ '--global-option',
+ dest='global_options',
+ action='append',
+ metavar='options',
+ help="Extra global options to be supplied to the setup.py "
+ "call before the install command.",
+) # type: Callable[..., Option]
+
+no_clean = partial(
+ Option,
+ '--no-clean',
+ action='store_true',
+ default=False,
+ help="Don't clean up build directories."
+) # type: Callable[..., Option]
+
+pre = partial(
+ Option,
+ '--pre',
+ action='store_true',
+ default=False,
+ help="Include pre-release and development versions. By default, "
+ "pip only finds stable versions.",
+) # type: Callable[..., Option]
+
+disable_pip_version_check = partial(
+ Option,
+ "--disable-pip-version-check",
+ dest="disable_pip_version_check",
+ action="store_true",
+ default=False,
+ help="Don't periodically check PyPI to determine whether a new version "
+ "of pip is available for download. Implied with --no-index.",
+) # type: Callable[..., Option]
+
+
+def _handle_merge_hash(option, opt_str, value, parser):
+ # type: (Option, str, str, OptionParser) -> None
+ """Given a value spelled "algo:digest", append the digest to a list
+ pointed to in a dict by the algo name."""
+ if not parser.values.hashes:
+ parser.values.hashes = {}
+ try:
+ algo, digest = value.split(':', 1)
+ except ValueError:
+ parser.error('Arguments to {} must be a hash name ' # noqa
+ 'followed by a value, like --hash=sha256:'
+ 'abcde...'.format(opt_str))
+ if algo not in STRONG_HASHES:
+ parser.error('Allowed hash algorithms for {} are {}.'.format( # noqa
+ opt_str, ', '.join(STRONG_HASHES)))
+ parser.values.hashes.setdefault(algo, []).append(digest)
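The accumulation can be sketched in isolation: each `"algo:digest"` argument appends the digest to a per-algorithm list (a hypothetical helper, not pip's API):

```python
def merge_hash(hashes, value):
    """Sketch: fold one "algo:digest" argument into the hashes dict."""
    algo, sep, digest = value.partition(':')
    if not sep or not digest:
        raise ValueError('expected "algo:digest", got {!r}'.format(value))
    hashes.setdefault(algo, []).append(digest)
    return hashes

hashes = {}
merge_hash(hashes, 'sha256:abc123')
merge_hash(hashes, 'sha256:def456')
merge_hash(hashes, 'sha512:789')
```

Repeated `--hash` options for the same algorithm therefore collect into one list, which is why the option is declared with a merging callback rather than a plain `append` action.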
+
+
+hash = partial(
+ Option,
+ '--hash',
+ # Hash values eventually end up in InstallRequirement.hashes due to
+ # __dict__ copying in process_line().
+ dest='hashes',
+ action='callback',
+ callback=_handle_merge_hash,
+ type='string',
+ help="Verify that the package's archive matches this "
+ 'hash before installing. Example: --hash=sha256:abcdef...',
+) # type: Callable[..., Option]
+
+
+require_hashes = partial(
+ Option,
+ '--require-hashes',
+ dest='require_hashes',
+ action='store_true',
+ default=False,
+ help='Require a hash to check each requirement against, for '
+ 'repeatable installs. This option is implied when any package in a '
+ 'requirements file has a --hash option.',
+) # type: Callable[..., Option]
+
+
+list_path = partial(
+ PipOption,
+ '--path',
+ dest='path',
+ type='path',
+ action='append',
+ help='Restrict to the specified installation path for listing '
+ 'packages (can be used multiple times).'
+) # type: Callable[..., Option]
+
+
+def check_list_path_option(options):
+ # type: (Values) -> None
+ if options.path and (options.user or options.local):
+ raise CommandError(
+ "Cannot combine '--path' with '--user' or '--local'"
+ )
+
+
+list_exclude = partial(
+ PipOption,
+ '--exclude',
+ dest='excludes',
+ action='append',
+ metavar='package',
+ type='package_name',
+ help="Exclude the specified package from the output.",
+) # type: Callable[..., Option]
+
+
+no_python_version_warning = partial(
+ Option,
+ '--no-python-version-warning',
+ dest='no_python_version_warning',
+ action='store_true',
+ default=False,
+ help='Silence deprecation warnings for upcoming unsupported Pythons.',
+) # type: Callable[..., Option]
+
+
+use_new_feature = partial(
+ Option,
+ '--use-feature',
+ dest='features_enabled',
+ metavar='feature',
+ action='append',
+ default=[],
+ choices=['2020-resolver', 'fast-deps'],
+ help='Enable new functionality that may be backward incompatible.',
+) # type: Callable[..., Option]
+
+use_deprecated_feature = partial(
+ Option,
+ '--use-deprecated',
+ dest='deprecated_features_enabled',
+ metavar='feature',
+ action='append',
+ default=[],
+ choices=['legacy-resolver'],
+ help=(
+ 'Enable deprecated functionality that will be removed in the future.'
+ ),
+) # type: Callable[..., Option]
+
+
+##########
+# groups #
+##########
+
+general_group = {
+ 'name': 'General Options',
+ 'options': [
+ help_,
+ isolated_mode,
+ require_virtualenv,
+ verbose,
+ version,
+ quiet,
+ log,
+ no_input,
+ proxy,
+ retries,
+ timeout,
+ exists_action,
+ trusted_host,
+ cert,
+ client_cert,
+ cache_dir,
+ no_cache,
+ disable_pip_version_check,
+ no_color,
+ no_python_version_warning,
+ use_new_feature,
+ use_deprecated_feature,
+ ]
+} # type: Dict[str, Any]
+
+index_group = {
+ 'name': 'Package Index Options',
+ 'options': [
+ index_url,
+ extra_index_url,
+ no_index,
+ find_links,
+ ]
+} # type: Dict[str, Any]
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/command_context.py b/.venv/lib/python3.9/site-packages/pip/_internal/cli/command_context.py
new file mode 100644
index 0000000..fcd6d07
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cli/command_context.py
@@ -0,0 +1,36 @@
+from contextlib import contextmanager
+
+from pip._vendor.contextlib2 import ExitStack
+
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from typing import ContextManager, Iterator, TypeVar
+
+ _T = TypeVar('_T', covariant=True)
+
+
+class CommandContextMixIn:
+ def __init__(self):
+ # type: () -> None
+ super().__init__()
+ self._in_main_context = False
+ self._main_context = ExitStack()
+
+ @contextmanager
+ def main_context(self):
+ # type: () -> Iterator[None]
+ assert not self._in_main_context
+
+ self._in_main_context = True
+ try:
+ with self._main_context:
+ yield
+ finally:
+ self._in_main_context = False
+
+ def enter_context(self, context_provider):
+ # type: (ContextManager[_T]) -> _T
+ assert self._in_main_context
+
+ return self._main_context.enter_context(context_provider)
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/main.py b/.venv/lib/python3.9/site-packages/pip/_internal/cli/main.py
new file mode 100644
index 0000000..f850423
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cli/main.py
@@ -0,0 +1,73 @@
+"""Primary application entrypoint.
+"""
+import locale
+import logging
+import os
+import sys
+
+from pip._internal.cli.autocompletion import autocomplete
+from pip._internal.cli.main_parser import parse_command
+from pip._internal.commands import create_command
+from pip._internal.exceptions import PipError
+from pip._internal.utils import deprecation
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from typing import List, Optional
+
+logger = logging.getLogger(__name__)
+
+
+# Do not import and use main() directly! Using it directly is actively
+# discouraged by pip's maintainers. The name, location and behavior of
+# this function is subject to change, so calling it directly is not
+# portable across different pip versions.
+
+# In addition, running pip in-process is unsupported and unsafe. This is
+# elaborated in detail at
+# https://pip.pypa.io/en/stable/user_guide/#using-pip-from-your-program.
+# That document also provides suggestions that should work for nearly
+# all users that are considering importing and using main() directly.
+
+# However, we know that certain users will still want to invoke pip
+# in-process. If you understand and accept the implications of using pip
+# in an unsupported manner, the best approach is to use runpy to avoid
+# depending on the exact location of this entry point.
+
+# The following example shows how to use runpy to invoke pip in that
+# case:
+#
+# sys.argv = ["pip", your, args, here]
+# runpy.run_module("pip", run_name="__main__")
+#
+# Note that this will exit the process after running, unlike a direct
+# call to main. As it is not safe to do any processing after calling
+# main, this should not be an issue in practice.
+
+def main(args=None):
+ # type: (Optional[List[str]]) -> int
+ if args is None:
+ args = sys.argv[1:]
+
+ # Configure our deprecation warnings to be sent through loggers
+ deprecation.install_warning_logger()
+
+ autocomplete()
+
+ try:
+ cmd_name, cmd_args = parse_command(args)
+ except PipError as exc:
+ sys.stderr.write(f"ERROR: {exc}")
+ sys.stderr.write(os.linesep)
+ sys.exit(1)
+
+ # Needed for locale.getpreferredencoding(False) to work
+ # in pip._internal.utils.encoding.auto_decode
+ try:
+ locale.setlocale(locale.LC_ALL, '')
+ except locale.Error as e:
+ # setlocale can apparently crash if locales are uninitialized
+ logger.debug("Ignoring error %s when setting locale", e)
+ command = create_command(cmd_name, isolated=("--isolated" in cmd_args))
+
+ return command.main(cmd_args)
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/main_parser.py b/.venv/lib/python3.9/site-packages/pip/_internal/cli/main_parser.py
new file mode 100644
index 0000000..7cb2da3
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cli/main_parser.py
@@ -0,0 +1,96 @@
+"""A single place for constructing and exposing the main parser
+"""
+
+import os
+import sys
+
+from pip._internal.cli import cmdoptions
+from pip._internal.cli.parser import ConfigOptionParser, UpdatingDefaultsHelpFormatter
+from pip._internal.commands import commands_dict, get_similar_commands
+from pip._internal.exceptions import CommandError
+from pip._internal.utils.misc import get_pip_version, get_prog
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from typing import List, Tuple
+
+
+__all__ = ["create_main_parser", "parse_command"]
+
+
+def create_main_parser():
+ # type: () -> ConfigOptionParser
+ """Creates and returns the main parser for pip's CLI
+ """
+
+ parser_kw = {
+ 'usage': '\n%prog [options]',
+ 'add_help_option': False,
+ 'formatter': UpdatingDefaultsHelpFormatter(),
+ 'name': 'global',
+ 'prog': get_prog(),
+ }
+
+ parser = ConfigOptionParser(**parser_kw)
+ parser.disable_interspersed_args()
+
+ parser.version = get_pip_version()
+
+ # add the general options
+ gen_opts = cmdoptions.make_option_group(cmdoptions.general_group, parser)
+ parser.add_option_group(gen_opts)
+
+ # so the help formatter knows
+ parser.main = True # type: ignore
+
+ # create command listing for description
+ description = [''] + [
+ '{name:27} {command_info.summary}'.format(**locals())
+ for name, command_info in commands_dict.items()
+ ]
+ parser.description = '\n'.join(description)
+
+ return parser
+
+
+def parse_command(args):
+ # type: (List[str]) -> Tuple[str, List[str]]
+ parser = create_main_parser()
+
+ # Note: parser calls disable_interspersed_args(), so the result of this
+ # call is to split the initial args into the general options before the
+ # subcommand and everything else.
+ # For example:
+ # args: ['--timeout=5', 'install', '--user', 'INITools']
+ # general_options: ['--timeout=5']
+ # args_else: ['install', '--user', 'INITools']
+ general_options, args_else = parser.parse_args(args)
+
+ # --version
+ if general_options.version:
+ sys.stdout.write(parser.version)
+ sys.stdout.write(os.linesep)
+ sys.exit()
+
+ # pip || pip help -> print_help()
+ if not args_else or (args_else[0] == 'help' and len(args_else) == 1):
+ parser.print_help()
+ sys.exit()
+
+ # the subcommand name
+ cmd_name = args_else[0]
+
+ if cmd_name not in commands_dict:
+ guess = get_similar_commands(cmd_name)
+
+ msg = [f'unknown command "{cmd_name}"']
+ if guess:
+ msg.append(f'maybe you meant "{guess}"')
+
+ raise CommandError(' - '.join(msg))
+
+ # all the args without the subcommand
+ cmd_args = args[:]
+ cmd_args.remove(cmd_name)
+
+ return cmd_name, cmd_args
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/parser.py b/.venv/lib/python3.9/site-packages/pip/_internal/cli/parser.py
new file mode 100644
index 0000000..79e56e8
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cli/parser.py
@@ -0,0 +1,281 @@
+"""Base option parser setup"""
+
+# The following comment should be removed at some point in the future.
+# mypy: disallow-untyped-defs=False
+
+import logging
+import optparse
+import shutil
+import sys
+import textwrap
+
+from pip._vendor.contextlib2 import suppress
+
+from pip._internal.cli.status_codes import UNKNOWN_ERROR
+from pip._internal.configuration import Configuration, ConfigurationError
+from pip._internal.utils.misc import redact_auth_from_url, strtobool
+
+logger = logging.getLogger(__name__)
+
+
+class PrettyHelpFormatter(optparse.IndentedHelpFormatter):
+ """A prettier/less verbose help formatter for optparse."""
+
+ def __init__(self, *args, **kwargs):
+ # help position must be aligned with __init__.parseopts.description
+ kwargs['max_help_position'] = 30
+ kwargs['indent_increment'] = 1
+ kwargs['width'] = shutil.get_terminal_size()[0] - 2
+ super().__init__(*args, **kwargs)
+
+ def format_option_strings(self, option):
+ return self._format_option_strings(option)
+
+ def _format_option_strings(self, option, mvarfmt=' <{}>', optsep=', '):
+ """
+ Return a comma-separated list of option strings and metavars.
+
+ :param option: the Option instance, whose spellings form pairs like ('-f', '--format')
+ :param mvarfmt: metavar format string
+ :param optsep: separator
+ """
+ opts = []
+
+ if option._short_opts:
+ opts.append(option._short_opts[0])
+ if option._long_opts:
+ opts.append(option._long_opts[0])
+ if len(opts) > 1:
+ opts.insert(1, optsep)
+
+ if option.takes_value():
+ metavar = option.metavar or option.dest.lower()
+ opts.append(mvarfmt.format(metavar.lower()))
+
+ return ''.join(opts)
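For example, with a hypothetical option spelled `('-i', '--index-url')` taking a `URL` metavar, these joining rules yield `-i, --index-url <url>`. A sketch of the same logic:

```python
def format_option_strings(short_opts, long_opts, metavar=None):
    """Sketch of the joining rules in _format_option_strings above."""
    opts = []
    if short_opts:
        opts.append(short_opts[0])
    if long_opts:
        opts.append(long_opts[0])
    if len(opts) > 1:
        opts.insert(1, ', ')
    if metavar:  # i.e. the option takes a value
        opts.append(' <{}>'.format(metavar.lower()))
    return ''.join(opts)
```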
+
+ def format_heading(self, heading):
+ if heading == 'Options':
+ return ''
+ return heading + ':\n'
+
+ def format_usage(self, usage):
+ """
+ Ensure there is only one newline between usage and the first heading
+ if there is no description.
+ """
+ msg = '\nUsage: {}\n'.format(
+ self.indent_lines(textwrap.dedent(usage), " "))
+ return msg
+
+ def format_description(self, description):
+ # leave full control over description to us
+ if description:
+ if hasattr(self.parser, 'main'):
+ label = 'Commands'
+ else:
+ label = 'Description'
+ # some doc strings have initial newlines, some don't
+ description = description.lstrip('\n')
+ # some doc strings have final newlines and spaces, some don't
+ description = description.rstrip()
+ # dedent, then reindent
+ description = self.indent_lines(textwrap.dedent(description), " ")
+ description = f'{label}:\n{description}\n'
+ return description
+ else:
+ return ''
+
+ def format_epilog(self, epilog):
+ # leave full control over epilog to us
+ if epilog:
+ return epilog
+ else:
+ return ''
+
+ def indent_lines(self, text, indent):
+ new_lines = [indent + line for line in text.split('\n')]
+ return "\n".join(new_lines)
+
+
+class UpdatingDefaultsHelpFormatter(PrettyHelpFormatter):
+ """Custom help formatter for use in ConfigOptionParser.
+
+ This updates the defaults before expanding them, allowing
+ them to show up correctly in the help listing.
+
+ It also redacts auth credentials from URL-type options.
+ """
+
+ def expand_default(self, option):
+ default_values = None
+ if self.parser is not None:
+ self.parser._update_defaults(self.parser.defaults)
+ default_values = self.parser.defaults.get(option.dest)
+ help_text = super().expand_default(option)
+
+ if default_values and option.metavar == 'URL':
+ if isinstance(default_values, str):
+ default_values = [default_values]
+
+ # If it's not a list, abort and just return the help text
+ if not isinstance(default_values, list):
+ default_values = []
+
+ for val in default_values:
+ help_text = help_text.replace(
+ val, redact_auth_from_url(val))
+
+ return help_text
+
+
+class CustomOptionParser(optparse.OptionParser):
+
+ def insert_option_group(self, idx, *args, **kwargs):
+ """Insert an OptionGroup at a given position."""
+ group = self.add_option_group(*args, **kwargs)
+
+ self.option_groups.pop()
+ self.option_groups.insert(idx, group)
+
+ return group
+
+ @property
+ def option_list_all(self):
+ """Get a list of all options, including those in option groups."""
+ res = self.option_list[:]
+ for i in self.option_groups:
+ res.extend(i.option_list)
+
+ return res
+
+
+class ConfigOptionParser(CustomOptionParser):
+ """Custom option parser which updates its defaults by checking the
+ configuration files and environmental variables"""
+
+ def __init__(self, *args, **kwargs):
+ self.name = kwargs.pop('name')
+
+ isolated = kwargs.pop("isolated", False)
+ self.config = Configuration(isolated)
+
+ assert self.name
+ super().__init__(*args, **kwargs)
+
+ def check_default(self, option, key, val):
+ try:
+ return option.check_value(key, val)
+ except optparse.OptionValueError as exc:
+ print(f"An error occurred during configuration: {exc}")
+ sys.exit(3)
+
+ def _get_ordered_configuration_items(self):
+ # Configuration gives keys in an unordered manner. Order them.
+ override_order = ["global", self.name, ":env:"]
+
+ # Pool the options into different groups
+ section_items = {name: [] for name in override_order}
+ for section_key, val in self.config.items():
+ # ignore empty values
+ if not val:
+ logger.debug(
+ "Ignoring configuration key '%s' as its value is empty.",
+ section_key
+ )
+ continue
+
+ section, key = section_key.split(".", 1)
+ if section in override_order:
+ section_items[section].append((key, val))
+
+ # Yield each group in their override order
+ for section in override_order:
+ for key, val in section_items[section]:
+ yield key, val
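The override order can be sketched on its own: `global` values come first, then the command's own section, then environment variables, so later sources win when defaults are applied in sequence (hypothetical standalone version):

```python
def ordered_config_items(config_items, command_name):
    """Sketch of pip's config override order: [global] first, then the
    command's own section, then environment variables (last one wins)."""
    override_order = ['global', command_name, ':env:']
    buckets = {name: [] for name in override_order}
    for section_key, val in config_items:
        if not val:
            continue  # empty values are ignored
        section, key = section_key.split('.', 1)
        if section in override_order:
            buckets[section].append((key, val))
    for section in override_order:
        for item in buckets[section]:
            yield item

items = [('install.timeout', '30'), ('global.timeout', '10'),
         (':env:.timeout', '60'), ('download.timeout', '99')]
result = list(ordered_config_items(items, 'install'))
```

Here `result` is ordered global, then `install`, then env; the `download` section is dropped because it does not apply to the `install` command.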
+
+ def _update_defaults(self, defaults):
+ """Updates the given defaults with values from the config files and
+ the environ. Does a little special handling for certain types of
+ options (lists)."""
+
+ # Accumulate complex default state.
+ self.values = optparse.Values(self.defaults)
+ late_eval = set()
+ # Then set the options with those values
+ for key, val in self._get_ordered_configuration_items():
+ # '--' because configuration supports only long names
+ option = self.get_option('--' + key)
+
+ # Ignore options not present in this parser. E.g. non-globals put
+ # in [global] by users that want them to apply to all applicable
+ # commands.
+ if option is None:
+ continue
+
+ if option.action in ('store_true', 'store_false'):
+ try:
+ val = strtobool(val)
+ except ValueError:
+                    self.error(
+                        '{} is not a valid value for the {} option; '  # noqa
+                        'please specify a boolean value like yes/no, '
+                        'true/false or 1/0 instead.'.format(val, key)
+                    )
+ elif option.action == 'count':
+ with suppress(ValueError):
+ val = strtobool(val)
+ with suppress(ValueError):
+ val = int(val)
+ if not isinstance(val, int) or val < 0:
+                        self.error(
+                            '{} is not a valid value for the {} option; '  # noqa
+                            'please specify either a non-negative integer '
+                            'or a boolean value like yes/no or false/true, '
+                            'which is equivalent to 1/0.'.format(val, key)
+                        )
+ elif option.action == 'append':
+ val = val.split()
+ val = [self.check_default(option, key, v) for v in val]
+ elif option.action == 'callback':
+ late_eval.add(option.dest)
+ opt_str = option.get_opt_string()
+ val = option.convert_value(opt_str, val)
+ # From take_action
+ args = option.callback_args or ()
+ kwargs = option.callback_kwargs or {}
+ option.callback(option, opt_str, val, self, *args, **kwargs)
+ else:
+ val = self.check_default(option, key, val)
+
+ defaults[option.dest] = val
+
+ for key in late_eval:
+ defaults[key] = getattr(self.values, key)
+ self.values = None
+ return defaults
+
+ def get_default_values(self):
+        """Overriding to make updating the defaults after instantiation of
+        the option parser possible; _update_defaults() does the dirty work."""
+ if not self.process_default_values:
+ # Old, pre-Optik 1.5 behaviour.
+ return optparse.Values(self.defaults)
+
+ # Load the configuration, or error out in case of an error
+ try:
+ self.config.load()
+ except ConfigurationError as err:
+ self.exit(UNKNOWN_ERROR, str(err))
+
+ defaults = self._update_defaults(self.defaults.copy()) # ours
+ for option in self._get_all_options():
+ default = defaults.get(option.dest)
+ if isinstance(default, str):
+ opt_str = option.get_opt_string()
+ defaults[option.dest] = option.check_value(opt_str, default)
+ return optparse.Values(defaults)
+
+ def error(self, msg):
+ self.print_usage(sys.stderr)
+ self.exit(UNKNOWN_ERROR, f"{msg}\n")
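The `_get_ordered_configuration_items` helper above yields configuration keys grouped by section in a fixed override order, so values from later sections win when `_update_defaults` applies them. A minimal standalone sketch of that precedence (the section names mirror the parser's `override_order`; the keys and values are illustrative, not pip's real configuration):

```python
def ordered_items(config, name):
    # Mirror the override order used above: "global" first, then the
    # command-specific section, then environment-derived values.
    override_order = ["global", name, ":env:"]
    section_items = {section: [] for section in override_order}
    for section_key, val in config.items():
        if not val:  # skip empty values, as the parser does
            continue
        section, key = section_key.split(".", 1)
        if section in override_order:
            section_items[section].append((key, val))
    for section in override_order:
        yield from section_items[section]

config = {
    "global.timeout": "15",
    "install.timeout": "60",  # command section overrides global
    ":env:.timeout": "5",     # environment overrides both
}
merged = {}
for key, val in ordered_items(config, "install"):
    merged[key] = val  # later sections overwrite earlier ones
```

Because each section is yielded in full before the next, a plain dict update reproduces the "last writer wins" semantics without any special-casing.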
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/progress_bars.py b/.venv/lib/python3.9/site-packages/pip/_internal/cli/progress_bars.py
new file mode 100644
index 0000000..ac60d59
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cli/progress_bars.py
@@ -0,0 +1,271 @@
+import itertools
+import sys
+from signal import SIGINT, default_int_handler, signal
+
+from pip._vendor.progress.bar import Bar, FillingCirclesBar, IncrementalBar
+from pip._vendor.progress.spinner import Spinner
+
+from pip._internal.utils.compat import WINDOWS
+from pip._internal.utils.logging import get_indentation
+from pip._internal.utils.misc import format_size
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from typing import Any, Dict, List
+
+try:
+ from pip._vendor import colorama
+# Lots of different errors can come from this, including SystemError and
+# ImportError.
+except Exception:
+ colorama = None
+
+
+def _select_progress_class(preferred, fallback):
+ # type: (Bar, Bar) -> Bar
+ encoding = getattr(preferred.file, "encoding", None)
+
+ # If we don't know what encoding this file is in, then we'll just assume
+ # that it doesn't support unicode and use the ASCII bar.
+ if not encoding:
+ return fallback
+
+ # Collect all of the possible characters we want to use with the preferred
+ # bar.
+ characters = [
+ getattr(preferred, "empty_fill", ""),
+ getattr(preferred, "fill", ""),
+ ]
+ characters += list(getattr(preferred, "phases", []))
+
+ # Try to decode the characters we're using for the bar using the encoding
+ # of the given file, if this works then we'll assume that we can use the
+ # fancier bar and if not we'll fall back to the plaintext bar.
+ try:
+ "".join(characters).encode(encoding)
+ except UnicodeEncodeError:
+ return fallback
+ else:
+ return preferred
+
+
+_BaseBar = _select_progress_class(IncrementalBar, Bar) # type: Any
+
+
+class InterruptibleMixin:
+ """
+ Helper to ensure that self.finish() gets called on keyboard interrupt.
+
+ This allows downloads to be interrupted without leaving temporary state
+ (like hidden cursors) behind.
+
+ This class is similar to the progress library's existing SigIntMixin
+ helper, but as of version 1.2, that helper has the following problems:
+
+ 1. It calls sys.exit().
+ 2. It discards the existing SIGINT handler completely.
+ 3. It leaves its own handler in place even after an uninterrupted finish,
+ which will have unexpected delayed effects if the user triggers an
+ unrelated keyboard interrupt some time after a progress-displaying
+ download has already completed, for example.
+ """
+
+ def __init__(self, *args, **kwargs):
+ # type: (List[Any], Dict[Any, Any]) -> None
+ """
+ Save the original SIGINT handler for later.
+ """
+ # https://github.com/python/mypy/issues/5887
+ super().__init__(*args, **kwargs) # type: ignore
+
+ self.original_handler = signal(SIGINT, self.handle_sigint)
+
+ # If signal() returns None, the previous handler was not installed from
+ # Python, and we cannot restore it. This probably should not happen,
+ # but if it does, we must restore something sensible instead, at least.
+ # The least bad option should be Python's default SIGINT handler, which
+ # just raises KeyboardInterrupt.
+ if self.original_handler is None:
+ self.original_handler = default_int_handler
+
+ def finish(self):
+ # type: () -> None
+ """
+ Restore the original SIGINT handler after finishing.
+
+ This should happen regardless of whether the progress display finishes
+ normally, or gets interrupted.
+ """
+ super().finish() # type: ignore
+ signal(SIGINT, self.original_handler)
+
+ def handle_sigint(self, signum, frame): # type: ignore
+ """
+ Call self.finish() before delegating to the original SIGINT handler.
+
+ This handler should only be in place while the progress display is
+ active.
+ """
+ self.finish()
+ self.original_handler(signum, frame)
+
+
+class SilentBar(Bar):
+
+ def update(self):
+ # type: () -> None
+ pass
+
+
+class BlueEmojiBar(IncrementalBar):
+
+ suffix = "%(percent)d%%"
+ bar_prefix = " "
+ bar_suffix = " "
+ phases = ("\U0001F539", "\U0001F537", "\U0001F535")
+
+
+class DownloadProgressMixin:
+
+ def __init__(self, *args, **kwargs):
+ # type: (List[Any], Dict[Any, Any]) -> None
+ # https://github.com/python/mypy/issues/5887
+ super().__init__(*args, **kwargs) # type: ignore
+ self.message = (" " * (
+ get_indentation() + 2
+ )) + self.message # type: str
+
+ @property
+ def downloaded(self):
+ # type: () -> str
+ return format_size(self.index) # type: ignore
+
+ @property
+ def download_speed(self):
+ # type: () -> str
+ # Avoid zero division errors...
+ if self.avg == 0.0: # type: ignore
+ return "..."
+ return format_size(1 / self.avg) + "/s" # type: ignore
+
+ @property
+ def pretty_eta(self):
+ # type: () -> str
+ if self.eta: # type: ignore
+ return f"eta {self.eta_td}" # type: ignore
+ return ""
+
+ def iter(self, it): # type: ignore
+ for x in it:
+ yield x
+ # B305 is incorrectly raised here
+ # https://github.com/PyCQA/flake8-bugbear/issues/59
+ self.next(len(x)) # noqa: B305
+ self.finish()
+
+
+class WindowsMixin:
+
+ def __init__(self, *args, **kwargs):
+ # type: (List[Any], Dict[Any, Any]) -> None
+ # The Windows terminal does not support the hide/show cursor ANSI codes
+ # even with colorama. So we'll ensure that hide_cursor is False on
+ # Windows.
+ # This call needs to go before the super() call, so that hide_cursor
+ # is set in time. The base progress bar class writes the "hide cursor"
+ # code to the terminal in its init, so if we don't set this soon
+ # enough, we get a "hide" with no corresponding "show"...
+ if WINDOWS and self.hide_cursor: # type: ignore
+ self.hide_cursor = False
+
+ # https://github.com/python/mypy/issues/5887
+ super().__init__(*args, **kwargs) # type: ignore
+
+ # Check if we are running on Windows and we have the colorama module,
+ # if we do then wrap our file with it.
+ if WINDOWS and colorama:
+ self.file = colorama.AnsiToWin32(self.file) # type: ignore
+ # The progress code expects to be able to call self.file.isatty()
+ # but the colorama.AnsiToWin32() object doesn't have that, so we'll
+ # add it.
+ self.file.isatty = lambda: self.file.wrapped.isatty()
+ # The progress code expects to be able to call self.file.flush()
+ # but the colorama.AnsiToWin32() object doesn't have that, so we'll
+ # add it.
+ self.file.flush = lambda: self.file.wrapped.flush()
+
+
+class BaseDownloadProgressBar(WindowsMixin, InterruptibleMixin,
+ DownloadProgressMixin):
+
+ file = sys.stdout
+ message = "%(percent)d%%"
+ suffix = "%(downloaded)s %(download_speed)s %(pretty_eta)s"
+
+
+class DefaultDownloadProgressBar(BaseDownloadProgressBar,
+ _BaseBar):
+ pass
+
+
+class DownloadSilentBar(BaseDownloadProgressBar, SilentBar):
+ pass
+
+
+class DownloadBar(BaseDownloadProgressBar,
+ Bar):
+ pass
+
+
+class DownloadFillingCirclesBar(BaseDownloadProgressBar,
+ FillingCirclesBar):
+ pass
+
+
+class DownloadBlueEmojiProgressBar(BaseDownloadProgressBar,
+ BlueEmojiBar):
+ pass
+
+
+class DownloadProgressSpinner(WindowsMixin, InterruptibleMixin,
+ DownloadProgressMixin, Spinner):
+
+ file = sys.stdout
+ suffix = "%(downloaded)s %(download_speed)s"
+
+ def next_phase(self):
+ # type: () -> str
+ if not hasattr(self, "_phaser"):
+ self._phaser = itertools.cycle(self.phases)
+ return next(self._phaser)
+
+ def update(self):
+ # type: () -> None
+ message = self.message % self
+ phase = self.next_phase()
+ suffix = self.suffix % self
+ line = ''.join([
+ message,
+ " " if message else "",
+ phase,
+ " " if suffix else "",
+ suffix,
+ ])
+
+ self.writeln(line)
+
+
+BAR_TYPES = {
+ "off": (DownloadSilentBar, DownloadSilentBar),
+ "on": (DefaultDownloadProgressBar, DownloadProgressSpinner),
+ "ascii": (DownloadBar, DownloadProgressSpinner),
+ "pretty": (DownloadFillingCirclesBar, DownloadProgressSpinner),
+ "emoji": (DownloadBlueEmojiProgressBar, DownloadProgressSpinner)
+}
+
+
+def DownloadProgressProvider(progress_bar, max=None): # type: ignore
+ if max is None or max == 0:
+ return BAR_TYPES[progress_bar][1]().iter
+ else:
+ return BAR_TYPES[progress_bar][0](max=max).iter
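`_select_progress_class` above decides between the fancy and plain progress bar by attempting to encode the bar's drawing characters with the output stream's encoding. The same probe can be written standalone (the character and encoding choices here are illustrative):

```python
def supports_chars(chars, encoding):
    # Return True when every character can be encoded, i.e. when the
    # fancier progress bar is safe to use on this output stream.
    if not encoding:
        # Unknown encoding: assume no unicode support, as above.
        return False
    try:
        "".join(chars).encode(encoding)
    except UnicodeEncodeError:
        return False
    return True
```

Probing by actually encoding, rather than comparing encoding names, sidesteps the many aliases and codec quirks a name-based check would have to enumerate.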
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/req_command.py b/.venv/lib/python3.9/site-packages/pip/_internal/cli/req_command.py
new file mode 100644
index 0000000..b7c5c4f
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cli/req_command.py
@@ -0,0 +1,426 @@
+"""Contains the Command base classes that depend on PipSession.
+
+The classes in this module are in a separate module so the commands not
+needing download / PackageFinder capability don't unnecessarily import the
+PackageFinder machinery and all its vendored dependencies, etc.
+"""
+
+import logging
+import os
+from functools import partial
+
+from pip._internal.cli import cmdoptions
+from pip._internal.cli.base_command import Command
+from pip._internal.cli.command_context import CommandContextMixIn
+from pip._internal.exceptions import CommandError, PreviousBuildDirError
+from pip._internal.index.collector import LinkCollector
+from pip._internal.index.package_finder import PackageFinder
+from pip._internal.models.selection_prefs import SelectionPreferences
+from pip._internal.network.session import PipSession
+from pip._internal.operations.prepare import RequirementPreparer
+from pip._internal.req.constructors import (
+ install_req_from_editable,
+ install_req_from_line,
+ install_req_from_parsed_requirement,
+ install_req_from_req_string,
+)
+from pip._internal.req.req_file import parse_requirements
+from pip._internal.self_outdated_check import pip_self_version_check
+from pip._internal.utils.temp_dir import tempdir_kinds
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from optparse import Values
+ from typing import Any, List, Optional, Tuple
+
+ from pip._internal.cache import WheelCache
+ from pip._internal.models.target_python import TargetPython
+ from pip._internal.req.req_install import InstallRequirement
+ from pip._internal.req.req_tracker import RequirementTracker
+ from pip._internal.resolution.base import BaseResolver
+ from pip._internal.utils.temp_dir import TempDirectory, TempDirectoryTypeRegistry
+
+
+logger = logging.getLogger(__name__)
+
+
+class SessionCommandMixin(CommandContextMixIn):
+
+ """
+ A class mixin for command classes needing _build_session().
+ """
+ def __init__(self):
+ # type: () -> None
+ super().__init__()
+ self._session = None # Optional[PipSession]
+
+ @classmethod
+ def _get_index_urls(cls, options):
+ # type: (Values) -> Optional[List[str]]
+ """Return a list of index urls from user-provided options."""
+ index_urls = []
+ if not getattr(options, "no_index", False):
+ url = getattr(options, "index_url", None)
+ if url:
+ index_urls.append(url)
+ urls = getattr(options, "extra_index_urls", None)
+ if urls:
+ index_urls.extend(urls)
+ # Return None rather than an empty list
+ return index_urls or None
+
+ def get_default_session(self, options):
+ # type: (Values) -> PipSession
+ """Get a default-managed session."""
+ if self._session is None:
+ self._session = self.enter_context(self._build_session(options))
+ # there's no type annotation on requests.Session, so it's
+ # automatically ContextManager[Any] and self._session becomes Any,
+ # then https://github.com/python/mypy/issues/7696 kicks in
+ assert self._session is not None
+ return self._session
+
+ def _build_session(self, options, retries=None, timeout=None):
+ # type: (Values, Optional[int], Optional[int]) -> PipSession
+ assert not options.cache_dir or os.path.isabs(options.cache_dir)
+ session = PipSession(
+ cache=(
+ os.path.join(options.cache_dir, "http")
+ if options.cache_dir else None
+ ),
+ retries=retries if retries is not None else options.retries,
+ trusted_hosts=options.trusted_hosts,
+ index_urls=self._get_index_urls(options),
+ )
+
+ # Handle custom ca-bundles from the user
+ if options.cert:
+ session.verify = options.cert
+
+ # Handle SSL client certificate
+ if options.client_cert:
+ session.cert = options.client_cert
+
+ # Handle timeouts
+ if options.timeout or timeout:
+ session.timeout = (
+ timeout if timeout is not None else options.timeout
+ )
+
+ # Handle configured proxies
+ if options.proxy:
+ session.proxies = {
+ "http": options.proxy,
+ "https": options.proxy,
+ }
+
+ # Determine if we can prompt the user for authentication or not
+ session.auth.prompting = not options.no_input
+
+ return session
+
+
+class IndexGroupCommand(Command, SessionCommandMixin):
+
+ """
+ Abstract base class for commands with the index_group options.
+
+ This also corresponds to the commands that permit the pip version check.
+ """
+
+ def handle_pip_version_check(self, options):
+ # type: (Values) -> None
+ """
+ Do the pip version check if not disabled.
+
+ This overrides the default behavior of not doing the check.
+ """
+ # Make sure the index_group options are present.
+ assert hasattr(options, 'no_index')
+
+ if options.disable_pip_version_check or options.no_index:
+ return
+
+ # Otherwise, check if we're using the latest version of pip available.
+ session = self._build_session(
+ options,
+ retries=0,
+ timeout=min(5, options.timeout)
+ )
+ with session:
+ pip_self_version_check(session, options)
+
+
+KEEPABLE_TEMPDIR_TYPES = [
+ tempdir_kinds.BUILD_ENV,
+ tempdir_kinds.EPHEM_WHEEL_CACHE,
+ tempdir_kinds.REQ_BUILD,
+]
+
+
+def with_cleanup(func):
+ # type: (Any) -> Any
+ """Decorator for common logic related to managing temporary
+ directories.
+ """
+ def configure_tempdir_registry(registry):
+ # type: (TempDirectoryTypeRegistry) -> None
+ for t in KEEPABLE_TEMPDIR_TYPES:
+ registry.set_delete(t, False)
+
+ def wrapper(self, options, args):
+ # type: (RequirementCommand, Values, List[Any]) -> Optional[int]
+ assert self.tempdir_registry is not None
+ if options.no_clean:
+ configure_tempdir_registry(self.tempdir_registry)
+
+ try:
+ return func(self, options, args)
+ except PreviousBuildDirError:
+ # This kind of conflict can occur when the user passes an explicit
+ # build directory with a pre-existing folder. In that case we do
+ # not want to accidentally remove it.
+ configure_tempdir_registry(self.tempdir_registry)
+ raise
+
+ return wrapper
+
+
+class RequirementCommand(IndexGroupCommand):
+
+ def __init__(self, *args, **kw):
+ # type: (Any, Any) -> None
+ super().__init__(*args, **kw)
+
+ self.cmd_opts.add_option(cmdoptions.no_clean())
+
+ @staticmethod
+ def determine_resolver_variant(options):
+ # type: (Values) -> str
+ """Determines which resolver should be used, based on the given options."""
+ if "legacy-resolver" in options.deprecated_features_enabled:
+ return "legacy"
+
+ return "2020-resolver"
+
+ @classmethod
+ def make_requirement_preparer(
+ cls,
+ temp_build_dir, # type: TempDirectory
+ options, # type: Values
+ req_tracker, # type: RequirementTracker
+ session, # type: PipSession
+ finder, # type: PackageFinder
+ use_user_site, # type: bool
+ download_dir=None, # type: str
+ ):
+ # type: (...) -> RequirementPreparer
+ """
+ Create a RequirementPreparer instance for the given parameters.
+ """
+ temp_build_dir_path = temp_build_dir.path
+ assert temp_build_dir_path is not None
+
+ resolver_variant = cls.determine_resolver_variant(options)
+ if resolver_variant == "2020-resolver":
+ lazy_wheel = 'fast-deps' in options.features_enabled
+ if lazy_wheel:
+ logger.warning(
+ 'pip is using lazily downloaded wheels using HTTP '
+ 'range requests to obtain dependency information. '
+ 'This experimental feature is enabled through '
+ '--use-feature=fast-deps and it is not ready for '
+ 'production.'
+ )
+ else:
+ lazy_wheel = False
+ if 'fast-deps' in options.features_enabled:
+ logger.warning(
+ 'fast-deps has no effect when used with the legacy resolver.'
+ )
+
+ return RequirementPreparer(
+ build_dir=temp_build_dir_path,
+ src_dir=options.src_dir,
+ download_dir=download_dir,
+ build_isolation=options.build_isolation,
+ req_tracker=req_tracker,
+ session=session,
+ progress_bar=options.progress_bar,
+ finder=finder,
+ require_hashes=options.require_hashes,
+ use_user_site=use_user_site,
+ lazy_wheel=lazy_wheel,
+ )
+
+ @classmethod
+ def make_resolver(
+ cls,
+ preparer, # type: RequirementPreparer
+ finder, # type: PackageFinder
+ options, # type: Values
+ wheel_cache=None, # type: Optional[WheelCache]
+ use_user_site=False, # type: bool
+ ignore_installed=True, # type: bool
+ ignore_requires_python=False, # type: bool
+ force_reinstall=False, # type: bool
+ upgrade_strategy="to-satisfy-only", # type: str
+ use_pep517=None, # type: Optional[bool]
+ py_version_info=None, # type: Optional[Tuple[int, ...]]
+ ):
+ # type: (...) -> BaseResolver
+ """
+ Create a Resolver instance for the given parameters.
+ """
+ make_install_req = partial(
+ install_req_from_req_string,
+ isolated=options.isolated_mode,
+ use_pep517=use_pep517,
+ )
+ resolver_variant = cls.determine_resolver_variant(options)
+ # The long import name and duplicated invocation is needed to convince
+ # Mypy into correctly typechecking. Otherwise it would complain the
+ # "Resolver" class being redefined.
+ if resolver_variant == "2020-resolver":
+ import pip._internal.resolution.resolvelib.resolver
+
+ return pip._internal.resolution.resolvelib.resolver.Resolver(
+ preparer=preparer,
+ finder=finder,
+ wheel_cache=wheel_cache,
+ make_install_req=make_install_req,
+ use_user_site=use_user_site,
+ ignore_dependencies=options.ignore_dependencies,
+ ignore_installed=ignore_installed,
+ ignore_requires_python=ignore_requires_python,
+ force_reinstall=force_reinstall,
+ upgrade_strategy=upgrade_strategy,
+ py_version_info=py_version_info,
+ )
+ import pip._internal.resolution.legacy.resolver
+ return pip._internal.resolution.legacy.resolver.Resolver(
+ preparer=preparer,
+ finder=finder,
+ wheel_cache=wheel_cache,
+ make_install_req=make_install_req,
+ use_user_site=use_user_site,
+ ignore_dependencies=options.ignore_dependencies,
+ ignore_installed=ignore_installed,
+ ignore_requires_python=ignore_requires_python,
+ force_reinstall=force_reinstall,
+ upgrade_strategy=upgrade_strategy,
+ py_version_info=py_version_info,
+ )
+
+ def get_requirements(
+ self,
+ args, # type: List[str]
+ options, # type: Values
+ finder, # type: PackageFinder
+ session, # type: PipSession
+ ):
+ # type: (...) -> List[InstallRequirement]
+ """
+ Parse command-line arguments into the corresponding requirements.
+ """
+ requirements = [] # type: List[InstallRequirement]
+ for filename in options.constraints:
+ for parsed_req in parse_requirements(
+ filename,
+ constraint=True, finder=finder, options=options,
+ session=session):
+ req_to_add = install_req_from_parsed_requirement(
+ parsed_req,
+ isolated=options.isolated_mode,
+ user_supplied=False,
+ )
+ requirements.append(req_to_add)
+
+ for req in args:
+ req_to_add = install_req_from_line(
+ req, None, isolated=options.isolated_mode,
+ use_pep517=options.use_pep517,
+ user_supplied=True,
+ )
+ requirements.append(req_to_add)
+
+ for req in options.editables:
+ req_to_add = install_req_from_editable(
+ req,
+ user_supplied=True,
+ isolated=options.isolated_mode,
+ use_pep517=options.use_pep517,
+ )
+ requirements.append(req_to_add)
+
+ # NOTE: options.require_hashes may be set if --require-hashes is True
+ for filename in options.requirements:
+ for parsed_req in parse_requirements(
+ filename,
+ finder=finder, options=options, session=session):
+ req_to_add = install_req_from_parsed_requirement(
+ parsed_req,
+ isolated=options.isolated_mode,
+ use_pep517=options.use_pep517,
+ user_supplied=True,
+ )
+ requirements.append(req_to_add)
+
+ # If any requirement has hash options, enable hash checking.
+ if any(req.has_hash_options for req in requirements):
+ options.require_hashes = True
+
+ if not (args or options.editables or options.requirements):
+ opts = {'name': self.name}
+ if options.find_links:
+ raise CommandError(
+ 'You must give at least one requirement to {name} '
+ '(maybe you meant "pip {name} {links}"?)'.format(
+ **dict(opts, links=' '.join(options.find_links))))
+ else:
+ raise CommandError(
+ 'You must give at least one requirement to {name} '
+ '(see "pip help {name}")'.format(**opts))
+
+ return requirements
+
+ @staticmethod
+ def trace_basic_info(finder):
+ # type: (PackageFinder) -> None
+ """
+ Trace basic information about the provided objects.
+ """
+ # Display where finder is looking for packages
+ search_scope = finder.search_scope
+ locations = search_scope.get_formatted_locations()
+ if locations:
+ logger.info(locations)
+
+ def _build_package_finder(
+ self,
+ options, # type: Values
+ session, # type: PipSession
+ target_python=None, # type: Optional[TargetPython]
+ ignore_requires_python=None, # type: Optional[bool]
+ ):
+ # type: (...) -> PackageFinder
+ """
+ Create a package finder appropriate to this requirement command.
+
+ :param ignore_requires_python: Whether to ignore incompatible
+ "Requires-Python" values in links. Defaults to False.
+ """
+ link_collector = LinkCollector.create(session, options=options)
+ selection_prefs = SelectionPreferences(
+ allow_yanked=True,
+ format_control=options.format_control,
+ allow_all_prereleases=options.pre,
+ prefer_binary=options.prefer_binary,
+ ignore_requires_python=ignore_requires_python,
+ )
+
+ return PackageFinder.create(
+ link_collector=link_collector,
+ selection_prefs=selection_prefs,
+ target_python=target_python,
+ )
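The `_get_index_urls` classmethod above collapses the `--no-index`, `--index-url`, and `--extra-index-url` options into a single list, reporting "no indexes" as `None` rather than an empty list. A small sketch of that logic with plain arguments in place of the optparse `Values` object (the parameter names are mine, not pip's option names):

```python
def get_index_urls(no_index, index_url=None, extra_index_urls=()):
    # Mirrors _get_index_urls above: --no-index suppresses everything;
    # otherwise the primary index plus any extras are collected, and
    # an empty result is reported as None rather than [].
    if no_index:
        return None
    urls = []
    if index_url:
        urls.append(index_url)
    urls.extend(extra_index_urls)
    return urls or None
```

Returning `None` instead of `[]` lets the `PipSession` constructor distinguish "use the default index" from "the user disabled all indexes".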
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/spinners.py b/.venv/lib/python3.9/site-packages/pip/_internal/cli/spinners.py
new file mode 100644
index 0000000..ece7a92
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cli/spinners.py
@@ -0,0 +1,171 @@
+import contextlib
+import itertools
+import logging
+import sys
+import time
+
+from pip._vendor.progress import HIDE_CURSOR, SHOW_CURSOR
+
+from pip._internal.utils.compat import WINDOWS
+from pip._internal.utils.logging import get_indentation
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from typing import IO, Iterator
+
+logger = logging.getLogger(__name__)
+
+
+class SpinnerInterface:
+ def spin(self):
+ # type: () -> None
+ raise NotImplementedError()
+
+ def finish(self, final_status):
+ # type: (str) -> None
+ raise NotImplementedError()
+
+
+class InteractiveSpinner(SpinnerInterface):
+ def __init__(self, message, file=None, spin_chars="-\\|/",
+ # Empirically, 8 updates/second looks nice
+ min_update_interval_seconds=0.125):
+ # type: (str, IO[str], str, float) -> None
+ self._message = message
+ if file is None:
+ file = sys.stdout
+ self._file = file
+ self._rate_limiter = RateLimiter(min_update_interval_seconds)
+ self._finished = False
+
+ self._spin_cycle = itertools.cycle(spin_chars)
+
+ self._file.write(" " * get_indentation() + self._message + " ... ")
+ self._width = 0
+
+ def _write(self, status):
+ # type: (str) -> None
+ assert not self._finished
+ # Erase what we wrote before by backspacing to the beginning, writing
+ # spaces to overwrite the old text, and then backspacing again
+ backup = "\b" * self._width
+ self._file.write(backup + " " * self._width + backup)
+ # Now we have a blank slate to add our status
+ self._file.write(status)
+ self._width = len(status)
+ self._file.flush()
+ self._rate_limiter.reset()
+
+ def spin(self):
+ # type: () -> None
+ if self._finished:
+ return
+ if not self._rate_limiter.ready():
+ return
+ self._write(next(self._spin_cycle))
+
+ def finish(self, final_status):
+ # type: (str) -> None
+ if self._finished:
+ return
+ self._write(final_status)
+ self._file.write("\n")
+ self._file.flush()
+ self._finished = True
+
+
+# Used for dumb terminals, non-interactive installs (no tty), etc.
+# We still print updates occasionally (once every 60 seconds by default) to
+# act as a keep-alive for systems like Travis-CI that take lack-of-output as
+# an indication that a task has frozen.
+class NonInteractiveSpinner(SpinnerInterface):
+ def __init__(self, message, min_update_interval_seconds=60):
+ # type: (str, float) -> None
+ self._message = message
+ self._finished = False
+ self._rate_limiter = RateLimiter(min_update_interval_seconds)
+ self._update("started")
+
+ def _update(self, status):
+ # type: (str) -> None
+ assert not self._finished
+ self._rate_limiter.reset()
+ logger.info("%s: %s", self._message, status)
+
+ def spin(self):
+ # type: () -> None
+ if self._finished:
+ return
+ if not self._rate_limiter.ready():
+ return
+ self._update("still running...")
+
+ def finish(self, final_status):
+ # type: (str) -> None
+ if self._finished:
+ return
+ self._update(
+ "finished with status '{final_status}'".format(**locals()))
+ self._finished = True
+
+
+class RateLimiter:
+ def __init__(self, min_update_interval_seconds):
+ # type: (float) -> None
+ self._min_update_interval_seconds = min_update_interval_seconds
+ self._last_update = 0 # type: float
+
+ def ready(self):
+ # type: () -> bool
+ now = time.time()
+ delta = now - self._last_update
+ return delta >= self._min_update_interval_seconds
+
+ def reset(self):
+ # type: () -> None
+ self._last_update = time.time()
+
+
+@contextlib.contextmanager
+def open_spinner(message):
+ # type: (str) -> Iterator[SpinnerInterface]
+ # Interactive spinner goes directly to sys.stdout rather than being routed
+ # through the logging system, but it acts like it has level INFO,
+ # i.e. it's only displayed if we're at level INFO or better.
+ # Non-interactive spinner goes through the logging system, so it is always
+ # in sync with logging configuration.
+ if sys.stdout.isatty() and logger.getEffectiveLevel() <= logging.INFO:
+ spinner = InteractiveSpinner(message) # type: SpinnerInterface
+ else:
+ spinner = NonInteractiveSpinner(message)
+ try:
+ with hidden_cursor(sys.stdout):
+ yield spinner
+ except KeyboardInterrupt:
+ spinner.finish("canceled")
+ raise
+ except Exception:
+ spinner.finish("error")
+ raise
+ else:
+ spinner.finish("done")
+
+
+@contextlib.contextmanager
+def hidden_cursor(file):
+ # type: (IO[str]) -> Iterator[None]
+ # The Windows terminal does not support the hide/show cursor ANSI codes,
+ # even via colorama. So don't even try.
+ if WINDOWS:
+ yield
+ # We don't want to clutter the output with control characters if we're
+ # writing to a file, or if the user is running with --quiet.
+ # See https://github.com/pypa/pip/issues/3418
+ elif not file.isatty() or logger.getEffectiveLevel() > logging.INFO:
+ yield
+ else:
+ file.write(HIDE_CURSOR)
+ try:
+ yield
+ finally:
+ file.write(SHOW_CURSOR)
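Both spinner classes above throttle their output through the same tiny `RateLimiter`: `ready()` reports whether the minimum interval has elapsed, and `reset()` records that an update just happened. A self-contained sketch with the same shape:

```python
import time


class RateLimiter:
    # Same shape as the spinner's RateLimiter above: ready() reports
    # whether enough time has passed since the last recorded update,
    # and reset() records an update.
    def __init__(self, min_interval):
        self._min_interval = min_interval
        self._last_update = 0.0  # epoch 0, so the first check is ready

    def ready(self):
        return time.time() - self._last_update >= self._min_interval

    def reset(self):
        self._last_update = time.time()
```

Initializing `_last_update` to zero makes the very first `ready()` call return True, which is why the spinners can draw immediately on construction.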
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/cli/status_codes.py b/.venv/lib/python3.9/site-packages/pip/_internal/cli/status_codes.py
new file mode 100644
index 0000000..cd93391
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/cli/status_codes.py
@@ -0,0 +1,6 @@
+SUCCESS = 0
+ERROR = 1
+UNKNOWN_ERROR = 2
+VIRTUALENV_NOT_FOUND = 3
+PREVIOUS_BUILD_DIR_ERROR = 4
+NO_MATCHES_FOUND = 23
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__init__.py b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__init__.py
new file mode 100644
index 0000000..315b5dd
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__init__.py
@@ -0,0 +1,115 @@
+"""
+Package containing all pip commands
+"""
+
+import importlib
+from collections import OrderedDict, namedtuple
+
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from typing import Any, Optional
+
+ from pip._internal.cli.base_command import Command
+
+
+CommandInfo = namedtuple('CommandInfo', 'module_path, class_name, summary')
+
+# The ordering matters for help display.
+# Also, even though the module path starts with the same
+# "pip._internal.commands" prefix in each case, we include the full path
+# because it makes testing easier (specifically when modifying commands_dict
+# in test setup / teardown by adding info for a FakeCommand class defined
+# in a test-related module).
+# Finally, we need to pass an iterable of pairs here rather than a dict
+# so that the ordering won't be lost when using Python 2.7.
+commands_dict = OrderedDict([
+ ('install', CommandInfo(
+ 'pip._internal.commands.install', 'InstallCommand',
+ 'Install packages.',
+ )),
+ ('download', CommandInfo(
+ 'pip._internal.commands.download', 'DownloadCommand',
+ 'Download packages.',
+ )),
+ ('uninstall', CommandInfo(
+ 'pip._internal.commands.uninstall', 'UninstallCommand',
+ 'Uninstall packages.',
+ )),
+ ('freeze', CommandInfo(
+ 'pip._internal.commands.freeze', 'FreezeCommand',
+ 'Output installed packages in requirements format.',
+ )),
+ ('list', CommandInfo(
+ 'pip._internal.commands.list', 'ListCommand',
+ 'List installed packages.',
+ )),
+ ('show', CommandInfo(
+ 'pip._internal.commands.show', 'ShowCommand',
+ 'Show information about installed packages.',
+ )),
+ ('check', CommandInfo(
+ 'pip._internal.commands.check', 'CheckCommand',
+ 'Verify installed packages have compatible dependencies.',
+ )),
+ ('config', CommandInfo(
+ 'pip._internal.commands.configuration', 'ConfigurationCommand',
+ 'Manage local and global configuration.',
+ )),
+ ('search', CommandInfo(
+ 'pip._internal.commands.search', 'SearchCommand',
+ 'Search PyPI for packages.',
+ )),
+ ('cache', CommandInfo(
+ 'pip._internal.commands.cache', 'CacheCommand',
+ "Inspect and manage pip's wheel cache.",
+ )),
+ ('wheel', CommandInfo(
+ 'pip._internal.commands.wheel', 'WheelCommand',
+ 'Build wheels from your requirements.',
+ )),
+ ('hash', CommandInfo(
+ 'pip._internal.commands.hash', 'HashCommand',
+ 'Compute hashes of package archives.',
+ )),
+ ('completion', CommandInfo(
+ 'pip._internal.commands.completion', 'CompletionCommand',
+ 'A helper command used for command completion.',
+ )),
+ ('debug', CommandInfo(
+ 'pip._internal.commands.debug', 'DebugCommand',
+ 'Show information useful for debugging.',
+ )),
+ ('help', CommandInfo(
+ 'pip._internal.commands.help', 'HelpCommand',
+ 'Show help for commands.',
+ )),
+]) # type: OrderedDict[str, CommandInfo]
+
+
+def create_command(name, **kwargs):
+ # type: (str, **Any) -> Command
+ """
+ Create an instance of the Command class with the given name.
+ """
+ module_path, class_name, summary = commands_dict[name]
+ module = importlib.import_module(module_path)
+ command_class = getattr(module, class_name)
+ command = command_class(name=name, summary=summary, **kwargs)
+
+ return command
+
+
+def get_similar_commands(name):
+ # type: (str) -> Optional[str]
+ """Command name auto-correct."""
+ from difflib import get_close_matches
+
+ name = name.lower()
+
+ close_commands = get_close_matches(name, commands_dict.keys())
+
+ if close_commands:
+ return close_commands[0]
+ else:
+ return None
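For context, `commands_dict` plus `create_command` boil down to a lazy-import lookup: class paths are stored as strings and only imported when a command is actually constructed, which keeps `pip`'s startup cheap. A minimal sketch of the same pattern, using stdlib classes as hypothetical stand-ins for pip's command classes:

```python
import importlib
from collections import OrderedDict
from difflib import get_close_matches

# Hypothetical mini-registry mapping command names to (module_path, class_name).
# The stdlib targets here stand in for pip's real command classes.
REGISTRY = OrderedDict([
    ('counter', ('collections', 'Counter')),
    ('chainmap', ('collections', 'ChainMap')),
])


def create_command(name):
    """Import the module lazily and return an instance of the named class."""
    module_path, class_name = REGISTRY[name]
    module = importlib.import_module(module_path)
    return getattr(module, class_name)()


def get_similar_commands(name):
    """Suggest the closest registered command name, or None."""
    matches = get_close_matches(name.lower(), REGISTRY.keys())
    return matches[0] if matches else None
```

The same shape generalizes to any CLI with many subcommands: nothing heavier than `difflib` is imported until a specific command is requested.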
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/__init__.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/__init__.cpython-39.pyc
new file mode 100644
index 0000000..f261853
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/__init__.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/cache.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/cache.cpython-39.pyc
new file mode 100644
index 0000000..65d75fa
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/cache.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/check.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/check.cpython-39.pyc
new file mode 100644
index 0000000..8a4da61
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/check.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/completion.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/completion.cpython-39.pyc
new file mode 100644
index 0000000..ce15758
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/completion.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/configuration.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/configuration.cpython-39.pyc
new file mode 100644
index 0000000..1fa30cf
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/configuration.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/debug.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/debug.cpython-39.pyc
new file mode 100644
index 0000000..e0a9f43
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/debug.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/download.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/download.cpython-39.pyc
new file mode 100644
index 0000000..b59187b
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/download.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/freeze.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/freeze.cpython-39.pyc
new file mode 100644
index 0000000..6aa191e
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/freeze.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/hash.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/hash.cpython-39.pyc
new file mode 100644
index 0000000..511dc6e
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/hash.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/help.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/help.cpython-39.pyc
new file mode 100644
index 0000000..fdaf56f
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/help.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/install.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/install.cpython-39.pyc
new file mode 100644
index 0000000..d21425a
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/install.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/list.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/list.cpython-39.pyc
new file mode 100644
index 0000000..0cbb85e
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/list.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/search.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/search.cpython-39.pyc
new file mode 100644
index 0000000..4c75845
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/search.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/show.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/show.cpython-39.pyc
new file mode 100644
index 0000000..ac83dab
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/show.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/uninstall.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/uninstall.cpython-39.pyc
new file mode 100644
index 0000000..d0e2f67
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/uninstall.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/wheel.cpython-39.pyc b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/wheel.cpython-39.pyc
new file mode 100644
index 0000000..8a476cb
Binary files /dev/null and b/.venv/lib/python3.9/site-packages/pip/_internal/commands/__pycache__/wheel.cpython-39.pyc differ
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/cache.py b/.venv/lib/python3.9/site-packages/pip/_internal/commands/cache.py
new file mode 100644
index 0000000..d2f7ae0
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/commands/cache.py
@@ -0,0 +1,232 @@
+import logging
+import os
+import textwrap
+
+import pip._internal.utils.filesystem as filesystem
+from pip._internal.cli.base_command import Command
+from pip._internal.cli.status_codes import ERROR, SUCCESS
+from pip._internal.exceptions import CommandError, PipError
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from optparse import Values
+ from typing import Any, List
+
+
+logger = logging.getLogger(__name__)
+
+
+class CacheCommand(Command):
+ """
+ Inspect and manage pip's wheel cache.
+
+ Subcommands:
+
+ - dir: Show the cache directory.
+ - info: Show information about the cache.
+ - list: List filenames of packages stored in the cache.
+ - remove: Remove one or more packages from the cache.
+ - purge: Remove all items from the cache.
+
+ ``<pattern>`` can be a glob expression or a package name.
+ """
+
+ ignore_require_venv = True
+ usage = """
+ %prog dir
+ %prog info
+ %prog list [<pattern>] [--format=[human, abspath]]
+ %prog remove <pattern>
+ %prog purge
+ """
+
+ def add_options(self):
+ # type: () -> None
+
+ self.cmd_opts.add_option(
+ '--format',
+ action='store',
+ dest='list_format',
+ default="human",
+ choices=('human', 'abspath'),
+ help="Select the output format among: human (default) or abspath"
+ )
+
+ self.parser.insert_option_group(0, self.cmd_opts)
+
+ def run(self, options, args):
+ # type: (Values, List[Any]) -> int
+ handlers = {
+ "dir": self.get_cache_dir,
+ "info": self.get_cache_info,
+ "list": self.list_cache_items,
+ "remove": self.remove_cache_items,
+ "purge": self.purge_cache,
+ }
+
+ if not options.cache_dir:
+ logger.error("pip cache commands can not "
+ "function since cache is disabled.")
+ return ERROR
+
+ # Determine action
+ if not args or args[0] not in handlers:
+ logger.error(
+ "Need an action (%s) to perform.",
+ ", ".join(sorted(handlers)),
+ )
+ return ERROR
+
+ action = args[0]
+
+ # Error handling happens here, not in the action-handlers.
+ try:
+ handlers[action](options, args[1:])
+ except PipError as e:
+ logger.error(e.args[0])
+ return ERROR
+
+ return SUCCESS
+
+ def get_cache_dir(self, options, args):
+ # type: (Values, List[Any]) -> None
+ if args:
+ raise CommandError('Too many arguments')
+
+ logger.info(options.cache_dir)
+
+ def get_cache_info(self, options, args):
+ # type: (Values, List[Any]) -> None
+ if args:
+ raise CommandError('Too many arguments')
+
+ num_http_files = len(self._find_http_files(options))
+ num_packages = len(self._find_wheels(options, '*'))
+
+ http_cache_location = self._cache_dir(options, 'http')
+ wheels_cache_location = self._cache_dir(options, 'wheels')
+ http_cache_size = filesystem.format_directory_size(http_cache_location)
+ wheels_cache_size = filesystem.format_directory_size(
+ wheels_cache_location
+ )
+
+ message = textwrap.dedent("""
+ Package index page cache location: {http_cache_location}
+ Package index page cache size: {http_cache_size}
+ Number of HTTP files: {num_http_files}
+ Wheels location: {wheels_cache_location}
+ Wheels size: {wheels_cache_size}
+ Number of wheels: {package_count}
+ """).format(
+ http_cache_location=http_cache_location,
+ http_cache_size=http_cache_size,
+ num_http_files=num_http_files,
+ wheels_cache_location=wheels_cache_location,
+ package_count=num_packages,
+ wheels_cache_size=wheels_cache_size,
+ ).strip()
+
+ logger.info(message)
+
+ def list_cache_items(self, options, args):
+ # type: (Values, List[Any]) -> None
+ if len(args) > 1:
+ raise CommandError('Too many arguments')
+
+ if args:
+ pattern = args[0]
+ else:
+ pattern = '*'
+
+ files = self._find_wheels(options, pattern)
+ if options.list_format == 'human':
+ self.format_for_human(files)
+ else:
+ self.format_for_abspath(files)
+
+ def format_for_human(self, files):
+ # type: (List[str]) -> None
+ if not files:
+ logger.info('Nothing cached.')
+ return
+
+ results = []
+ for filename in files:
+ wheel = os.path.basename(filename)
+ size = filesystem.format_file_size(filename)
+ results.append(f' - {wheel} ({size})')
+ logger.info('Cache contents:\n')
+ logger.info('\n'.join(sorted(results)))
+
+ def format_for_abspath(self, files):
+ # type: (List[str]) -> None
+ if not files:
+ return
+
+ results = []
+ for filename in files:
+ results.append(filename)
+
+ logger.info('\n'.join(sorted(results)))
+
+ def remove_cache_items(self, options, args):
+ # type: (Values, List[Any]) -> None
+ if len(args) > 1:
+ raise CommandError('Too many arguments')
+
+ if not args:
+ raise CommandError('Please provide a pattern')
+
+ files = self._find_wheels(options, args[0])
+
+ # Only fetch http files if no specific pattern given
+ if args[0] == '*':
+ files += self._find_http_files(options)
+
+ if not files:
+ raise CommandError('No matching packages')
+
+ for filename in files:
+ os.unlink(filename)
+ logger.debug('Removed %s', filename)
+ logger.info('Files removed: %s', len(files))
+
+ def purge_cache(self, options, args):
+ # type: (Values, List[Any]) -> None
+ if args:
+ raise CommandError('Too many arguments')
+
+ return self.remove_cache_items(options, ['*'])
+
+ def _cache_dir(self, options, subdir):
+ # type: (Values, str) -> str
+ return os.path.join(options.cache_dir, subdir)
+
+ def _find_http_files(self, options):
+ # type: (Values) -> List[str]
+ http_dir = self._cache_dir(options, 'http')
+ return filesystem.find_files(http_dir, '*')
+
+ def _find_wheels(self, options, pattern):
+ # type: (Values, str) -> List[str]
+ wheel_dir = self._cache_dir(options, 'wheels')
+
+ # The wheel filename format, as specified in PEP 427, is:
+ # {distribution}-{version}(-{build})?-{python}-{abi}-{platform}.whl
+ #
+ # Additionally, non-alphanumeric values in the distribution are
+ # normalized to underscores (_), meaning hyphens can never occur
+ # before `-{version}`.
+ #
+ # Given that information:
+ # - If the pattern we're given contains a hyphen (-), the user is
+ # providing at least the version. Thus, we can just append `*.whl`
+ # to match the rest of it.
+ # - If the pattern we're given doesn't contain a hyphen (-), the
+ # user is only providing the name. Thus, we append `-*.whl` to
+ # match the hyphen before the version, followed by anything else.
+ #
+ # PEP 427: https://www.python.org/dev/peps/pep-0427/
+ pattern = pattern + ("*.whl" if "-" in pattern else "-*.whl")
+
+ return filesystem.find_files(wheel_dir, pattern)
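The glob construction at the end of `_find_wheels` is subtle enough to isolate. A self-contained sketch of just that rule, distinguishing name-only patterns from patterns that already include (part of) a version:

```python
def wheel_glob(pattern):
    # Per PEP 427, hyphens in a wheel's distribution name are normalized to
    # underscores, so the first '-' in a wheel filename always precedes the
    # version. A pattern containing '-' therefore already covers the version
    # boundary; otherwise we must add the '-' separator ourselves.
    return pattern + ("*.whl" if "-" in pattern else "-*.whl")
```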
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/check.py b/.venv/lib/python3.9/site-packages/pip/_internal/commands/check.py
new file mode 100644
index 0000000..d938da5
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/commands/check.py
@@ -0,0 +1,51 @@
+import logging
+
+from pip._internal.cli.base_command import Command
+from pip._internal.cli.status_codes import ERROR, SUCCESS
+from pip._internal.operations.check import (
+ check_package_set,
+ create_package_set_from_installed,
+)
+from pip._internal.utils.misc import write_output
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+logger = logging.getLogger(__name__)
+
+if MYPY_CHECK_RUNNING:
+ from optparse import Values
+ from typing import Any, List
+
+
+class CheckCommand(Command):
+ """Verify installed packages have compatible dependencies."""
+
+ usage = """
+ %prog [options]"""
+
+ def run(self, options, args):
+ # type: (Values, List[Any]) -> int
+
+ package_set, parsing_probs = create_package_set_from_installed()
+ missing, conflicting = check_package_set(package_set)
+
+ for project_name in missing:
+ version = package_set[project_name].version
+ for dependency in missing[project_name]:
+ write_output(
+ "%s %s requires %s, which is not installed.",
+ project_name, version, dependency[0],
+ )
+
+ for project_name in conflicting:
+ version = package_set[project_name].version
+ for dep_name, dep_version, req in conflicting[project_name]:
+ write_output(
+ "%s %s has requirement %s, but you have %s %s.",
+ project_name, version, req, dep_name, dep_version,
+ )
+
+ if missing or conflicting or parsing_probs:
+ return ERROR
+ else:
+ write_output("No broken requirements found.")
+ return SUCCESS
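At its core, `pip check` is a membership walk over the installed-package set. A simplified sketch of the missing-dependency half, with a hypothetical data shape (real pip also validates version conflicts against requirement specifiers):

```python
def find_missing(installed):
    """Return {package: [absent dependencies]} for a simplified environment.

    `installed` maps a package name to the list of names it requires;
    version constraints are ignored in this sketch.
    """
    missing = {}
    for name, deps in installed.items():
        absent = [dep for dep in deps if dep not in installed]
        if absent:
            missing[name] = absent
    return missing
```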
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/completion.py b/.venv/lib/python3.9/site-packages/pip/_internal/commands/completion.py
new file mode 100644
index 0000000..7b690fa
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/commands/completion.py
@@ -0,0 +1,96 @@
+import sys
+import textwrap
+
+from pip._internal.cli.base_command import Command
+from pip._internal.cli.status_codes import SUCCESS
+from pip._internal.utils.misc import get_prog
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from optparse import Values
+ from typing import List
+
+BASE_COMPLETION = """
+# pip {shell} completion start{script}# pip {shell} completion end
+"""
+
+COMPLETION_SCRIPTS = {
+ 'bash': """
+ _pip_completion()
+ {{
+ COMPREPLY=( $( COMP_WORDS="${{COMP_WORDS[*]}}" \\
+ COMP_CWORD=$COMP_CWORD \\
+ PIP_AUTO_COMPLETE=1 $1 2>/dev/null ) )
+ }}
+ complete -o default -F _pip_completion {prog}
+ """,
+ 'zsh': """
+ function _pip_completion {{
+ local words cword
+ read -Ac words
+ read -cn cword
+ reply=( $( COMP_WORDS="$words[*]" \\
+ COMP_CWORD=$(( cword-1 )) \\
+ PIP_AUTO_COMPLETE=1 $words[1] 2>/dev/null ))
+ }}
+ compctl -K _pip_completion {prog}
+ """,
+ 'fish': """
+ function __fish_complete_pip
+ set -lx COMP_WORDS (commandline -o) ""
+ set -lx COMP_CWORD ( \\
+ math (contains -i -- (commandline -t) $COMP_WORDS)-1 \\
+ )
+ set -lx PIP_AUTO_COMPLETE 1
+ string split \\ -- (eval $COMP_WORDS[1])
+ end
+ complete -fa "(__fish_complete_pip)" -c {prog}
+ """,
+}
+
+
+class CompletionCommand(Command):
+ """A helper command to be used for command completion."""
+
+ ignore_require_venv = True
+
+ def add_options(self):
+ # type: () -> None
+ self.cmd_opts.add_option(
+ '--bash', '-b',
+ action='store_const',
+ const='bash',
+ dest='shell',
+ help='Emit completion code for bash')
+ self.cmd_opts.add_option(
+ '--zsh', '-z',
+ action='store_const',
+ const='zsh',
+ dest='shell',
+ help='Emit completion code for zsh')
+ self.cmd_opts.add_option(
+ '--fish', '-f',
+ action='store_const',
+ const='fish',
+ dest='shell',
+ help='Emit completion code for fish')
+
+ self.parser.insert_option_group(0, self.cmd_opts)
+
+ def run(self, options, args):
+ # type: (Values, List[str]) -> int
+ """Prints the completion code of the given shell"""
+ shells = COMPLETION_SCRIPTS.keys()
+ shell_options = ['--' + shell for shell in sorted(shells)]
+ if options.shell in shells:
+ script = textwrap.dedent(
+ COMPLETION_SCRIPTS.get(options.shell, '').format(
+ prog=get_prog())
+ )
+ print(BASE_COMPLETION.format(script=script, shell=options.shell))
+ return SUCCESS
+ else:
+ sys.stderr.write(
+ 'ERROR: You must pass {}\n'.format(' or '.join(shell_options))
+ )
+ return SUCCESS
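The completion command is essentially a template lookup: pick the per-shell script, substitute the program name, and wrap it in start/end sentinel comments. A sketch with a hypothetical, stripped-down template table (real pip ships full bash/zsh/fish scripts keyed the same way):

```python
import textwrap

# Hypothetical one-entry template table; only the bash stub is shown.
SCRIPTS = {
    'bash': """
    complete -o default -F _pip_completion {prog}
    """,
}

BASE = "# pip {shell} completion start{script}# pip {shell} completion end\n"


def render_completion(shell, prog):
    # Dedent the indented template, then wrap it in sentinel comments so the
    # block can later be located (and replaced) in a user's shell rc file.
    script = textwrap.dedent(SCRIPTS[shell].format(prog=prog))
    return BASE.format(shell=shell, script=script)
```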
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/configuration.py b/.venv/lib/python3.9/site-packages/pip/_internal/commands/configuration.py
new file mode 100644
index 0000000..ad59f02
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/commands/configuration.py
@@ -0,0 +1,280 @@
+import logging
+import os
+import subprocess
+
+from pip._internal.cli.base_command import Command
+from pip._internal.cli.status_codes import ERROR, SUCCESS
+from pip._internal.configuration import Configuration, get_configuration_files, kinds
+from pip._internal.exceptions import PipError
+from pip._internal.utils.logging import indent_log
+from pip._internal.utils.misc import get_prog, write_output
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from optparse import Values
+ from typing import Any, List, Optional
+
+ from pip._internal.configuration import Kind
+
+logger = logging.getLogger(__name__)
+
+
+class ConfigurationCommand(Command):
+ """
+ Manage local and global configuration.
+
+ Subcommands:
+
+ - list: List the active configuration (or from the file specified)
+ - edit: Edit the configuration file in an editor
+ - get: Get the value associated with name
+ - set: Set the name=value
+ - unset: Unset the value associated with name
+ - debug: List the configuration files and values defined under them
+
+ If none of --user, --global and --site are passed, a virtual
+ environment configuration file is used if one is active and the file
+ exists. Otherwise, all modifications happen to the user file by
+ default.
+ """
+
+ ignore_require_venv = True
+ usage = """
+ %prog [<file-option>] list
+ %prog [<file-option>] [--editor <editor-path>] edit
+
+ %prog [<file-option>] get name
+ %prog [<file-option>] set name value
+ %prog [<file-option>] unset name
+ %prog [<file-option>] debug
+ """
+
+ def add_options(self):
+ # type: () -> None
+ self.cmd_opts.add_option(
+ '--editor',
+ dest='editor',
+ action='store',
+ default=None,
+ help=(
+ 'Editor to use to edit the file. Uses VISUAL or EDITOR '
+ 'environment variables if not provided.'
+ )
+ )
+
+ self.cmd_opts.add_option(
+ '--global',
+ dest='global_file',
+ action='store_true',
+ default=False,
+ help='Use the system-wide configuration file only'
+ )
+
+ self.cmd_opts.add_option(
+ '--user',
+ dest='user_file',
+ action='store_true',
+ default=False,
+ help='Use the user configuration file only'
+ )
+
+ self.cmd_opts.add_option(
+ '--site',
+ dest='site_file',
+ action='store_true',
+ default=False,
+ help='Use the current environment configuration file only'
+ )
+
+ self.parser.insert_option_group(0, self.cmd_opts)
+
+ def run(self, options, args):
+ # type: (Values, List[str]) -> int
+ handlers = {
+ "list": self.list_values,
+ "edit": self.open_in_editor,
+ "get": self.get_name,
+ "set": self.set_name_value,
+ "unset": self.unset_name,
+ "debug": self.list_config_values,
+ }
+
+ # Determine action
+ if not args or args[0] not in handlers:
+ logger.error(
+ "Need an action (%s) to perform.",
+ ", ".join(sorted(handlers)),
+ )
+ return ERROR
+
+ action = args[0]
+
+ # Determine which configuration files are to be loaded
+ # Depends on whether the command is modifying.
+ try:
+ load_only = self._determine_file(
+ options, need_value=(action in ["get", "set", "unset", "edit"])
+ )
+ except PipError as e:
+ logger.error(e.args[0])
+ return ERROR
+
+ # Load a new configuration
+ self.configuration = Configuration(
+ isolated=options.isolated_mode, load_only=load_only
+ )
+ self.configuration.load()
+
+ # Error handling happens here, not in the action-handlers.
+ try:
+ handlers[action](options, args[1:])
+ except PipError as e:
+ logger.error(e.args[0])
+ return ERROR
+
+ return SUCCESS
+
+ def _determine_file(self, options, need_value):
+ # type: (Values, bool) -> Optional[Kind]
+ file_options = [key for key, value in (
+ (kinds.USER, options.user_file),
+ (kinds.GLOBAL, options.global_file),
+ (kinds.SITE, options.site_file),
+ ) if value]
+
+ if not file_options:
+ if not need_value:
+ return None
+ # Default to user, unless there's a site file.
+ elif any(
+ os.path.exists(site_config_file)
+ for site_config_file in get_configuration_files()[kinds.SITE]
+ ):
+ return kinds.SITE
+ else:
+ return kinds.USER
+ elif len(file_options) == 1:
+ return file_options[0]
+
+ raise PipError(
+ "Need exactly one file to operate upon "
+ "(--user, --site, --global) to perform."
+ )
+
+ def list_values(self, options, args):
+ # type: (Values, List[str]) -> None
+ self._get_n_args(args, "list", n=0)
+
+ for key, value in sorted(self.configuration.items()):
+ write_output("%s=%r", key, value)
+
+ def get_name(self, options, args):
+ # type: (Values, List[str]) -> None
+ key = self._get_n_args(args, "get [name]", n=1)
+ value = self.configuration.get_value(key)
+
+ write_output("%s", value)
+
+ def set_name_value(self, options, args):
+ # type: (Values, List[str]) -> None
+ key, value = self._get_n_args(args, "set [name] [value]", n=2)
+ self.configuration.set_value(key, value)
+
+ self._save_configuration()
+
+ def unset_name(self, options, args):
+ # type: (Values, List[str]) -> None
+ key = self._get_n_args(args, "unset [name]", n=1)
+ self.configuration.unset_value(key)
+
+ self._save_configuration()
+
+ def list_config_values(self, options, args):
+ # type: (Values, List[str]) -> None
+ """List config key-value pairs across different config files"""
+ self._get_n_args(args, "debug", n=0)
+
+ self.print_env_var_values()
+ # Iterate over config files and print if they exist, and the
+ # key-value pairs present in them if they do
+ for variant, files in sorted(self.configuration.iter_config_files()):
+ write_output("%s:", variant)
+ for fname in files:
+ with indent_log():
+ file_exists = os.path.exists(fname)
+ write_output("%s, exists: %r",
+ fname, file_exists)
+ if file_exists:
+ self.print_config_file_values(variant)
+
+ def print_config_file_values(self, variant):
+ # type: (Kind) -> None
+ """Get key-value pairs from the file of a variant"""
+ for name, value in self.configuration.\
+ get_values_in_config(variant).items():
+ with indent_log():
+ write_output("%s: %s", name, value)
+
+ def print_env_var_values(self):
+ # type: () -> None
+ """Get key-values pairs present as environment variables"""
+ write_output("%s:", 'env_var')
+ with indent_log():
+ for key, value in sorted(self.configuration.get_environ_vars()):
+ env_var = f'PIP_{key.upper()}'
+ write_output("%s=%r", env_var, value)
+
+ def open_in_editor(self, options, args):
+ # type: (Values, List[str]) -> None
+ editor = self._determine_editor(options)
+
+ fname = self.configuration.get_file_to_edit()
+ if fname is None:
+ raise PipError("Could not determine appropriate file.")
+
+ try:
+ subprocess.check_call([editor, fname])
+ except subprocess.CalledProcessError as e:
+ raise PipError(
+ "Editor Subprocess exited with exit code {}"
+ .format(e.returncode)
+ )
+
+ def _get_n_args(self, args, example, n):
+ # type: (List[str], str, int) -> Any
+ """Helper to make sure the command got the right number of arguments
+ """
+ if len(args) != n:
+ msg = (
+ 'Got unexpected number of arguments, expected {}. '
+ '(example: "{} config {}")'
+ ).format(n, get_prog(), example)
+ raise PipError(msg)
+
+ if n == 1:
+ return args[0]
+ else:
+ return args
+
+ def _save_configuration(self):
+ # type: () -> None
+ # We successfully ran a modifying command. Need to save the
+ # configuration.
+ try:
+ self.configuration.save()
+ except Exception:
+ logger.exception(
+ "Unable to save configuration. Please report this as a bug."
+ )
+ raise PipError("Internal Error.")
+
+ def _determine_editor(self, options):
+ # type: (Values) -> str
+ if options.editor is not None:
+ return options.editor
+ elif "VISUAL" in os.environ:
+ return os.environ["VISUAL"]
+ elif "EDITOR" in os.environ:
+ return os.environ["EDITOR"]
+ else:
+ raise PipError("Could not determine editor to use.")
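The precedence logic in `_determine_file` rewards a closer look: zero flags means "merge everything" for read-only actions but "pick a sensible default" for modifying ones, while two or more flags is always an error. A sketch of that decision with plain booleans (hypothetical signature, not pip's API):

```python
def determine_file(user, global_, site, need_value, site_file_exists):
    """Mirror the flag precedence with plain booleans.

    Returns 'user', 'global', 'site', or None; raises ValueError when more
    than one location flag is set.
    """
    chosen = [name for name, flag in
              (('user', user), ('global', global_), ('site', site)) if flag]
    if not chosen:
        if not need_value:
            return None  # read-only commands may merge all config files
        # Modifying commands default to the site file if one exists.
        return 'site' if site_file_exists else 'user'
    if len(chosen) == 1:
        return chosen[0]
    raise ValueError('Need exactly one of --user, --site, --global.')
```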
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/debug.py b/.venv/lib/python3.9/site-packages/pip/_internal/commands/debug.py
new file mode 100644
index 0000000..301436b
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/commands/debug.py
@@ -0,0 +1,218 @@
+import locale
+import logging
+import os
+import sys
+
+import pip._vendor
+from pip._vendor import pkg_resources
+from pip._vendor.certifi import where
+from pip._vendor.packaging.version import parse as parse_version
+
+from pip import __file__ as pip_location
+from pip._internal.cli import cmdoptions
+from pip._internal.cli.base_command import Command
+from pip._internal.cli.cmdoptions import make_target_python
+from pip._internal.cli.status_codes import SUCCESS
+from pip._internal.utils.logging import indent_log
+from pip._internal.utils.misc import get_pip_version
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from optparse import Values
+ from types import ModuleType
+ from typing import Dict, List, Optional
+
+ from pip._internal.configuration import Configuration
+
+logger = logging.getLogger(__name__)
+
+
+def show_value(name, value):
+ # type: (str, Optional[str]) -> None
+ logger.info('%s: %s', name, value)
+
+
+def show_sys_implementation():
+ # type: () -> None
+ logger.info('sys.implementation:')
+ implementation_name = sys.implementation.name
+ with indent_log():
+ show_value('name', implementation_name)
+
+
+def create_vendor_txt_map():
+ # type: () -> Dict[str, str]
+ vendor_txt_path = os.path.join(
+ os.path.dirname(pip_location),
+ '_vendor',
+ 'vendor.txt'
+ )
+
+ with open(vendor_txt_path) as f:
+ # Purge non version specifying lines.
+ # Also, remove any space prefix or suffixes (including comments).
+ lines = [line.strip().split(' ', 1)[0]
+ for line in f.readlines() if '==' in line]
+
+ # Transform into "module" -> version dict.
+ return dict(line.split('==', 1) for line in lines) # type: ignore
+
+
+def get_module_from_module_name(module_name):
+ # type: (str) -> ModuleType
+ # Module name can be uppercase in vendor.txt for some reason...
+ module_name = module_name.lower()
+ # PATCH: setuptools is actually only pkg_resources.
+ if module_name == 'setuptools':
+ module_name = 'pkg_resources'
+
+ __import__(
+ f'pip._vendor.{module_name}',
+ globals(),
+ locals(),
+ level=0
+ )
+ return getattr(pip._vendor, module_name)
+
+
+def get_vendor_version_from_module(module_name):
+ # type: (str) -> Optional[str]
+ module = get_module_from_module_name(module_name)
+ version = getattr(module, '__version__', None)
+
+ if not version:
+ # Try to find version in debundled module info
+ pkg_set = pkg_resources.WorkingSet([os.path.dirname(module.__file__)])
+ package = pkg_set.find(pkg_resources.Requirement.parse(module_name))
+ version = getattr(package, 'version', None)
+
+ return version
+
+
+def show_actual_vendor_versions(vendor_txt_versions):
+ # type: (Dict[str, str]) -> None
+ """Log the actual version and print extra info if there is
+ a conflict or if the actual version could not be imported.
+ """
+ for module_name, expected_version in vendor_txt_versions.items():
+ extra_message = ''
+ actual_version = get_vendor_version_from_module(module_name)
+ if not actual_version:
+ extra_message = ' (Unable to locate actual module version, using'\
+ ' vendor.txt specified version)'
+ actual_version = expected_version
+ elif parse_version(actual_version) != parse_version(expected_version):
+ extra_message = ' (CONFLICT: vendor.txt suggests version should'\
+ ' be {})'.format(expected_version)
+ logger.info('%s==%s%s', module_name, actual_version, extra_message)
+
+
+def show_vendor_versions():
+ # type: () -> None
+ logger.info('vendored library versions:')
+
+ vendor_txt_versions = create_vendor_txt_map()
+ with indent_log():
+ show_actual_vendor_versions(vendor_txt_versions)
+
+
+def show_tags(options):
+ # type: (Values) -> None
+ tag_limit = 10
+
+ target_python = make_target_python(options)
+ tags = target_python.get_tags()
+
+ # Display the target options that were explicitly provided.
+ formatted_target = target_python.format_given()
+ suffix = ''
+ if formatted_target:
+ suffix = f' (target: {formatted_target})'
+
+ msg = 'Compatible tags: {}{}'.format(len(tags), suffix)
+ logger.info(msg)
+
+ if options.verbose < 1 and len(tags) > tag_limit:
+ tags_limited = True
+ tags = tags[:tag_limit]
+ else:
+ tags_limited = False
+
+ with indent_log():
+ for tag in tags:
+ logger.info(str(tag))
+
+ if tags_limited:
+ msg = (
+ '...\n'
+ '[First {tag_limit} tags shown. Pass --verbose to show all.]'
+ ).format(tag_limit=tag_limit)
+ logger.info(msg)
+
+
+def ca_bundle_info(config):
+ # type: (Configuration) -> str
+ levels = set()
+ for key, _ in config.items():
+ levels.add(key.split('.')[0])
+
+ if not levels:
+ return "Not specified"
+
+ levels_that_override_global = ['install', 'wheel', 'download']
+ global_overriding_level = [
+ level for level in levels if level in levels_that_override_global
+ ]
+ if not global_overriding_level:
+ return 'global'
+
+ if 'global' in levels:
+ levels.remove('global')
+ return ", ".join(levels)
+
+
+class DebugCommand(Command):
+ """
+ Display debug information.
+ """
+
+ usage = """
+ %prog <options>"""
+ ignore_require_venv = True
+
+ def add_options(self):
+ # type: () -> None
+ cmdoptions.add_target_python_options(self.cmd_opts)
+ self.parser.insert_option_group(0, self.cmd_opts)
+ self.parser.config.load()
+
+ def run(self, options, args):
+ # type: (Values, List[str]) -> int
+ logger.warning(
+ "This command is only meant for debugging. "
+ "Do not use this with automation for parsing and getting these "
+ "details, since the output and options of this command may "
+ "change without notice."
+ )
+ show_value('pip version', get_pip_version())
+ show_value('sys.version', sys.version)
+ show_value('sys.executable', sys.executable)
+ show_value('sys.getdefaultencoding', sys.getdefaultencoding())
+ show_value('sys.getfilesystemencoding', sys.getfilesystemencoding())
+ show_value(
+ 'locale.getpreferredencoding', locale.getpreferredencoding(),
+ )
+ show_value('sys.platform', sys.platform)
+ show_sys_implementation()
+
+ show_value("'cert' config value", ca_bundle_info(self.parser.config))
+ show_value("REQUESTS_CA_BUNDLE", os.environ.get('REQUESTS_CA_BUNDLE'))
+ show_value("CURL_CA_BUNDLE", os.environ.get('CURL_CA_BUNDLE'))
+ show_value("pip._vendor.certifi.where()", where())
+ show_value("pip._vendor.DEBUNDLED", pip._vendor.DEBUNDLED)
+
+ show_vendor_versions()
+
+ show_tags(options)
+
+ return SUCCESS
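The `ca_bundle_info` helper above reduces the loaded configuration to a short description of which config level(s) supply the `cert` value. The same logic can be sketched standalone, with the config items passed as plain `(key, value)` pairs instead of pip's `Configuration` object (an assumption made for illustration; a deterministic `sorted` join is also added, which the original does not do):

```python
def cert_config_source(items):
    """Describe which config level(s) determine the 'cert' value.

    items: iterable of (key, value) pairs such as ("install.cert", "/path").
    """
    # The part before the first dot is the config level, e.g. "global.cert".
    levels = {key.split('.')[0] for key, _ in items}
    if not levels:
        return "Not specified"

    # Per-command sections take precedence over the 'global' section.
    overriding = [level for level in levels
                  if level in ('install', 'wheel', 'download')]
    if not overriding:
        return 'global'

    levels.discard('global')
    return ", ".join(sorted(levels))
```

For example, a config with only a `global.cert` entry reports `'global'`, while one that also sets `install.cert` reports `'install'`.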
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/download.py b/.venv/lib/python3.9/site-packages/pip/_internal/commands/download.py
new file mode 100644
index 0000000..8e3e077
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/commands/download.py
@@ -0,0 +1,144 @@
+import logging
+import os
+
+from pip._internal.cli import cmdoptions
+from pip._internal.cli.cmdoptions import make_target_python
+from pip._internal.cli.req_command import RequirementCommand, with_cleanup
+from pip._internal.cli.status_codes import SUCCESS
+from pip._internal.req.req_tracker import get_requirement_tracker
+from pip._internal.utils.misc import ensure_dir, normalize_path, write_output
+from pip._internal.utils.temp_dir import TempDirectory
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from optparse import Values
+ from typing import List
+
+logger = logging.getLogger(__name__)
+
+
+class DownloadCommand(RequirementCommand):
+ """
+ Download packages from:
+
+ - PyPI (and other indexes) using requirement specifiers.
+ - VCS project urls.
+ - Local project directories.
+ - Local or remote source archives.
+
+ pip also supports downloading from "requirements files", which provide
+ an easy way to specify a whole environment to be downloaded.
+ """
+
+ usage = """
+      %prog [options] <requirement specifier> [package-index-options] ...
+      %prog [options] -r <requirements file> [package-index-options] ...
+      %prog [options] <vcs project url> ...
+      %prog [options] <local project path> ...
+      %prog [options] <archive url/path> ..."""
+
+ def add_options(self):
+ # type: () -> None
+ self.cmd_opts.add_option(cmdoptions.constraints())
+ self.cmd_opts.add_option(cmdoptions.requirements())
+ self.cmd_opts.add_option(cmdoptions.build_dir())
+ self.cmd_opts.add_option(cmdoptions.no_deps())
+ self.cmd_opts.add_option(cmdoptions.global_options())
+ self.cmd_opts.add_option(cmdoptions.no_binary())
+ self.cmd_opts.add_option(cmdoptions.only_binary())
+ self.cmd_opts.add_option(cmdoptions.prefer_binary())
+ self.cmd_opts.add_option(cmdoptions.src())
+ self.cmd_opts.add_option(cmdoptions.pre())
+ self.cmd_opts.add_option(cmdoptions.require_hashes())
+ self.cmd_opts.add_option(cmdoptions.progress_bar())
+ self.cmd_opts.add_option(cmdoptions.no_build_isolation())
+ self.cmd_opts.add_option(cmdoptions.use_pep517())
+ self.cmd_opts.add_option(cmdoptions.no_use_pep517())
+ self.cmd_opts.add_option(cmdoptions.ignore_requires_python())
+
+ self.cmd_opts.add_option(
+ '-d', '--dest', '--destination-dir', '--destination-directory',
+ dest='download_dir',
+ metavar='dir',
+ default=os.curdir,
+        help=("Download packages into <dir>."),
+ )
+
+ cmdoptions.add_target_python_options(self.cmd_opts)
+
+ index_opts = cmdoptions.make_option_group(
+ cmdoptions.index_group,
+ self.parser,
+ )
+
+ self.parser.insert_option_group(0, index_opts)
+ self.parser.insert_option_group(0, self.cmd_opts)
+
+ @with_cleanup
+ def run(self, options, args):
+ # type: (Values, List[str]) -> int
+
+ options.ignore_installed = True
+ # editable doesn't really make sense for `pip download`, but the bowels
+ # of the RequirementSet code require that property.
+ options.editables = []
+
+ cmdoptions.check_dist_restriction(options)
+
+ options.download_dir = normalize_path(options.download_dir)
+ ensure_dir(options.download_dir)
+
+ session = self.get_default_session(options)
+
+ target_python = make_target_python(options)
+ finder = self._build_package_finder(
+ options=options,
+ session=session,
+ target_python=target_python,
+ ignore_requires_python=options.ignore_requires_python,
+ )
+
+ req_tracker = self.enter_context(get_requirement_tracker())
+
+ directory = TempDirectory(
+ delete=not options.no_clean,
+ kind="download",
+ globally_managed=True,
+ )
+
+ reqs = self.get_requirements(args, options, finder, session)
+
+ preparer = self.make_requirement_preparer(
+ temp_build_dir=directory,
+ options=options,
+ req_tracker=req_tracker,
+ session=session,
+ finder=finder,
+ download_dir=options.download_dir,
+ use_user_site=False,
+ )
+
+ resolver = self.make_resolver(
+ preparer=preparer,
+ finder=finder,
+ options=options,
+ ignore_requires_python=options.ignore_requires_python,
+ py_version_info=options.python_version,
+ )
+
+ self.trace_basic_info(finder)
+
+ requirement_set = resolver.resolve(
+ reqs, check_supported_wheels=True
+ )
+
+ downloaded = [] # type: List[str]
+ for req in requirement_set.requirements.values():
+ if req.satisfied_by is None:
+ assert req.name is not None
+ preparer.save_linked_requirement(req)
+ downloaded.append(req.name)
+ if downloaded:
+ write_output('Successfully downloaded %s', ' '.join(downloaded))
+
+ return SUCCESS
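The loop at the end of `run` saves only those resolved requirements that are not already satisfied by an installed distribution (`satisfied_by is None`). That selection can be sketched in isolation, using a plain dict of `name -> satisfied_by` in place of `requirement_set.requirements` (an assumption for illustration):

```python
def names_to_download(requirements):
    """Return names of resolved requirements that still need downloading.

    requirements: mapping of project name -> satisfied_by, where None
    means no installed distribution satisfies it, so it must be fetched.
    """
    downloaded = []
    for name, satisfied_by in requirements.items():
        if satisfied_by is None:
            downloaded.append(name)
    return downloaded
```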
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/freeze.py b/.venv/lib/python3.9/site-packages/pip/_internal/commands/freeze.py
new file mode 100644
index 0000000..6eb1db2
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/commands/freeze.py
@@ -0,0 +1,107 @@
+import sys
+
+from pip._internal.cli import cmdoptions
+from pip._internal.cli.base_command import Command
+from pip._internal.cli.status_codes import SUCCESS
+from pip._internal.operations.freeze import freeze
+from pip._internal.utils.compat import stdlib_pkgs
+from pip._internal.utils.deprecation import deprecated
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+DEV_PKGS = {'pip', 'setuptools', 'distribute', 'wheel'}
+
+if MYPY_CHECK_RUNNING:
+ from optparse import Values
+ from typing import List
+
+
+class FreezeCommand(Command):
+ """
+ Output installed packages in requirements format.
+
+    Packages are listed in a case-insensitive sorted order.
+ """
+
+ usage = """
+ %prog [options]"""
+ log_streams = ("ext://sys.stderr", "ext://sys.stderr")
+
+ def add_options(self):
+ # type: () -> None
+ self.cmd_opts.add_option(
+ '-r', '--requirement',
+ dest='requirements',
+ action='append',
+ default=[],
+ metavar='file',
+ help="Use the order in the given requirements file and its "
+ "comments when generating output. This option can be "
+ "used multiple times.")
+ self.cmd_opts.add_option(
+ '-f', '--find-links',
+ dest='find_links',
+ action='append',
+ default=[],
+ metavar='URL',
+ help='URL for finding packages, which will be added to the '
+ 'output.')
+ self.cmd_opts.add_option(
+ '-l', '--local',
+ dest='local',
+ action='store_true',
+ default=False,
+ help='If in a virtualenv that has global access, do not output '
+ 'globally-installed packages.')
+ self.cmd_opts.add_option(
+ '--user',
+ dest='user',
+ action='store_true',
+ default=False,
+ help='Only output packages installed in user-site.')
+ self.cmd_opts.add_option(cmdoptions.list_path())
+ self.cmd_opts.add_option(
+ '--all',
+ dest='freeze_all',
+ action='store_true',
+ help='Do not skip these packages in the output:'
+ ' {}'.format(', '.join(DEV_PKGS)))
+ self.cmd_opts.add_option(
+ '--exclude-editable',
+ dest='exclude_editable',
+ action='store_true',
+ help='Exclude editable package from output.')
+ self.cmd_opts.add_option(cmdoptions.list_exclude())
+
+ self.parser.insert_option_group(0, self.cmd_opts)
+
+ def run(self, options, args):
+ # type: (Values, List[str]) -> int
+ skip = set(stdlib_pkgs)
+ if not options.freeze_all:
+ skip.update(DEV_PKGS)
+
+ if options.excludes:
+ skip.update(options.excludes)
+
+ cmdoptions.check_list_path_option(options)
+
+ if options.find_links:
+ deprecated(
+ "--find-links option in pip freeze is deprecated.",
+ replacement=None,
+ gone_in="21.2",
+ issue=9069,
+ )
+
+ for line in freeze(
+ requirement=options.requirements,
+ find_links=options.find_links,
+ local_only=options.local,
+ user_only=options.user,
+ paths=options.path,
+ isolated=options.isolated_mode,
+ skip=skip,
+ exclude_editable=options.exclude_editable,
+ ):
+ sys.stdout.write(line + '\n')
+ return SUCCESS
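The skip set built at the top of `run` decides which distributions are omitted from `pip freeze` output: standard-library packages always, the dev packages unless `--all` is given, plus any user-supplied `--exclude` names. A self-contained sketch of that logic, with `stdlib_pkgs` passed in explicitly since the real value comes from `pip._internal.utils.compat`:

```python
DEV_PKGS = {'pip', 'setuptools', 'distribute', 'wheel'}

def build_skip_set(freeze_all=False, excludes=None, stdlib_pkgs=frozenset()):
    """Return the set of package names to omit from freeze output."""
    skip = set(stdlib_pkgs)
    if not freeze_all:      # --all keeps the dev packages in the output
        skip.update(DEV_PKGS)
    if excludes:            # --exclude adds user-specified names
        skip.update(excludes)
    return skip
```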
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/hash.py b/.venv/lib/python3.9/site-packages/pip/_internal/commands/hash.py
new file mode 100644
index 0000000..891e393
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/commands/hash.py
@@ -0,0 +1,61 @@
+import hashlib
+import logging
+import sys
+
+from pip._internal.cli.base_command import Command
+from pip._internal.cli.status_codes import ERROR, SUCCESS
+from pip._internal.utils.hashes import FAVORITE_HASH, STRONG_HASHES
+from pip._internal.utils.misc import read_chunks, write_output
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from optparse import Values
+ from typing import List
+
+logger = logging.getLogger(__name__)
+
+
+class HashCommand(Command):
+ """
+ Compute a hash of a local package archive.
+
+ These can be used with --hash in a requirements file to do repeatable
+ installs.
+ """
+
+    usage = '%prog [options] <file> ...'
+ ignore_require_venv = True
+
+ def add_options(self):
+ # type: () -> None
+ self.cmd_opts.add_option(
+ '-a', '--algorithm',
+ dest='algorithm',
+ choices=STRONG_HASHES,
+ action='store',
+ default=FAVORITE_HASH,
+ help='The hash algorithm to use: one of {}'.format(
+ ', '.join(STRONG_HASHES)))
+ self.parser.insert_option_group(0, self.cmd_opts)
+
+ def run(self, options, args):
+ # type: (Values, List[str]) -> int
+ if not args:
+ self.parser.print_usage(sys.stderr)
+ return ERROR
+
+ algorithm = options.algorithm
+ for path in args:
+ write_output('%s:\n--hash=%s:%s',
+ path, algorithm, _hash_of_file(path, algorithm))
+ return SUCCESS
+
+
+def _hash_of_file(path, algorithm):
+ # type: (str, str) -> str
+ """Return the hash digest of a file."""
+ with open(path, 'rb') as archive:
+ hash = hashlib.new(algorithm)
+ for chunk in read_chunks(archive):
+ hash.update(chunk)
+ return hash.hexdigest()
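`_hash_of_file` streams the archive in chunks so that large files are never read into memory at once. An equivalent standalone version, substituting `iter` with a sentinel for pip's internal `read_chunks` helper:

```python
import hashlib

def hash_of_file(path, algorithm='sha256', chunk_size=8192):
    """Return the hex digest of a file, read in fixed-size chunks."""
    digest = hashlib.new(algorithm)
    with open(path, 'rb') as f:
        # iter() with a b'' sentinel stops once read() returns empty bytes.
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()
```

The resulting digest matches hashing the whole file contents in one call, which is what makes it usable for `--hash=` pins in a requirements file.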
diff --git a/.venv/lib/python3.9/site-packages/pip/_internal/commands/help.py b/.venv/lib/python3.9/site-packages/pip/_internal/commands/help.py
new file mode 100644
index 0000000..4d83c52
--- /dev/null
+++ b/.venv/lib/python3.9/site-packages/pip/_internal/commands/help.py
@@ -0,0 +1,44 @@
+from pip._internal.cli.base_command import Command
+from pip._internal.cli.status_codes import SUCCESS
+from pip._internal.exceptions import CommandError
+from pip._internal.utils.typing import MYPY_CHECK_RUNNING
+
+if MYPY_CHECK_RUNNING:
+ from optparse import Values
+ from typing import List
+
+
+class HelpCommand(Command):
+ """Show help for commands"""
+
+ usage = """
+      %prog <command>"""