tags:
- time series foundation models
- time-series
---

# Chronos-T5 Mini

Chronos models are pre-trained **time series forecasting models** based on language model architectures.
A time series is transformed into a sequence of tokens via scaling and quantization, and forecasts are obtained by sampling multiple sequences of future observations given historical context.
Chronos models are trained on a large corpus of publicly available time series data, as well as synthetic data.
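
To make the scaling-and-quantization step concrete, here is an illustrative sketch: mean scaling followed by uniform binning. The `tokenize` helper, the uniform bin placement, and the `limit` of 15 are assumptions for illustration only; the actual Chronos tokenizer (including its special tokens) lives in the companion repo.

```
import numpy as np

def tokenize(series: np.ndarray, n_tokens: int = 4096, limit: float = 15.0) -> np.ndarray:
    """Map real values to token ids via mean scaling + uniform binning (illustrative)."""
    # Mean scaling: normalize by the mean absolute value of the context.
    scale = float(np.abs(series).mean())
    if scale == 0.0:
        scale = 1.0
    scaled = series / scale
    # Uniform quantization: each value becomes the index of its bin, i.e. a token id.
    edges = np.linspace(-limit, limit, n_tokens - 1)
    return np.digitize(scaled, edges)

tokens = tokenize(np.array([112.0, 118.0, 132.0, 129.0]))  # ids in [0, 4095]
```

Forecasting then amounts to autoregressively sampling token sequences from the model and mapping them back through the inverse of this transformation.
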
For details on Chronos models, training data and procedures, and experimental results, refer to the paper [Chronos: Learning the Language of Time Series](https://www.example.com/).

## Architecture

The model in this repository is based on the [T5 architecture](https://arxiv.org/abs/1910.10683). The only difference is in the vocabulary size: Chronos-T5 uses 4096 distinct tokens, compared to the 32128 of the original T5 models, resulting in fewer total parameters.

Model | Parameters | Based on
----------------|-------------------|----------------------
[chronos-t5-mini](https://huggingface.co/amazon/chronos-t5-mini) | 20M | [t5-efficient-mini](https://huggingface.co/google/t5-efficient-mini)
[chronos-t5-small](https://huggingface.co/amazon/chronos-t5-small) | 46M | [flan-t5-small](https://huggingface.co/google/flan-t5-small)
[chronos-t5-base](https://huggingface.co/amazon/chronos-t5-base) | 200M | [flan-t5-base](https://huggingface.co/google/flan-t5-base)
[chronos-t5-large](https://huggingface.co/amazon/chronos-t5-large) | 710M | [flan-t5-large](https://huggingface.co/google/flan-t5-large)
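
The reduced vocabulary can be checked directly from the checkpoint configuration. A minimal sketch using Hugging Face `transformers`, assuming the repository ships a standard T5 `config.json`:

```
from transformers import AutoConfig

# Chronos-T5 checkpoints use a standard T5 config with a reduced vocabulary.
config = AutoConfig.from_pretrained("amazon/chronos-t5-mini")
print(config.vocab_size)  # expected to print 4096 (original T5: 32128)
```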

## Usage

To perform inference with Chronos models, refer to the code and examples in the [companion GitHub repo](https://www.example.com/).
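
For orientation, a typical call might look like the sketch below. The `chronos` package name, the `ChronosPipeline` class, and the `predict` signature are assumptions based on the companion repo and may differ between versions.

```
import numpy as np
import torch
from chronos import ChronosPipeline  # assumed package/class from the companion repo

# Load the pretrained checkpoint; bfloat16 keeps GPU memory usage low.
pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-mini",
    device_map="cuda",  # use "cpu" if no GPU is available
    torch_dtype=torch.bfloat16,
)

# Historical context: a 1-D tensor of past observations.
context = torch.tensor([112.0, 118.0, 132.0, 129.0, 121.0, 135.0])

# Sample future trajectories; result shape: [num_series, num_samples, prediction_length].
forecast = pipeline.predict(context, prediction_length=12)

# Turn samples into a median forecast with an 80% prediction interval.
low, median, high = np.quantile(
    forecast[0].float().cpu().numpy(), [0.1, 0.5, 0.9], axis=0
)
```

Because the model produces samples rather than a single trajectory, quantiles across samples give both point forecasts and uncertainty bands.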

## References

If you find Chronos models useful for your research, please consider citing the associated [paper](https://www.example.com/):

```
paper citation
```