Update README.md
README.md

---
language:
- en
tags:
- data
- compression
- training
- decompression
---
# Model Card for Model ID

Development of Data Compression Tools for Maintenance and Utilization of Large-scale Research Facilities

## Model Details

- **Learning mechanism**
- **Compression mechanism**
- **Decompression mechanism**

### Model Description

**Learning mechanism**

PredNet (a ConvLSTM-based network) is used to learn how the movement of an object changes over time. Following PredNet's training procedure, the training data is first converted to the hkl format and then used for training. The trained model is written to a file, and this file is used by both the compression mechanism and the decompression mechanism. A separate program is used to download the training data and convert it to hkl; the details are explained in the "Learning mechanism" section below.
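
As a rough illustration of the data-preparation step, the sketch below packs a directory of image frames into a single hkl file. The directory layout, file names, and array shape are assumptions for illustration only, not part of the tool itself.

```python
import glob
import numpy as np
import hickle as hkl
from PIL import Image

# Collect the frames in temporal order (the path pattern is an assumption).
frame_paths = sorted(glob.glob("frames/*.png"))

# Stack them into a (num_frames, height, width, channels) uint8 array,
# the kind of time-series tensor a PredNet-style model consumes.
frames = np.stack([np.asarray(Image.open(p).convert("RGB")) for p in frame_paths])

# Serialize the array to the hkl (hickle/HDF5) format used as training input.
hkl.dump(frames, "train_data.hkl", mode="w", compression="gzip")

# The training script can later reload it with:
# frames = hkl.load("train_data.hkl")
```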

**Compression mechanism**

Using the model produced by the learning mechanism, the compression mechanism compresses the time-series images by combining inference results with the differences from the original frames. After deriving the difference between each original image and its inference result, error-bounded quantization, Density-based Spatial Encoding, and Partitioned Entropy Encoding are applied; these steps increase the achievable compression ratio. The result is then compressed with the zstd library and written to a binary file (.dat).
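
A minimal sketch of the residual path is shown below, assuming a uniform error-bounded quantizer and the zstandard Python binding. The spatial and entropy encoding stages are omitted, and the function name, error bound, and file name are illustrative assumptions.

```python
import numpy as np
import zstandard as zstd

def compress_residual(original: np.ndarray, predicted: np.ndarray,
                      error_bound: float = 1.0) -> bytes:
    """Quantize the prediction residual within an error bound and zstd-compress it."""
    # Difference between the original frame and the model's inference result.
    residual = original.astype(np.float32) - predicted.astype(np.float32)

    # Uniform error-bounded quantization: the reconstructed value differs from
    # the true residual by at most `error_bound` (this is the lossy step).
    quantized = np.round(residual / (2.0 * error_bound)).astype(np.int32)

    # Spatial/entropy encoding is skipped in this sketch; the quantized integers
    # go straight to zstd and are written out as the .dat payload.
    return zstd.ZstdCompressor(level=19).compress(quantized.tobytes())

payload = compress_residual(np.random.randint(0, 256, (128, 128)),
                            np.random.randint(0, 256, (128, 128)))
with open("residual.dat", "wb") as f:
    f.write(payload)
```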

The differences and keyframe images are likewise compressed with the zstd library and written to a binary file (.dat).

**Decompression mechanism**

Using the model produced by the learning mechanism together with the binary file (.dat) produced by the compression mechanism, the decompression mechanism restores the set of images that was given to the compressor. Running inference on the keyframes reproduces the inference results used during compression. Density-based Spatial Decoding and Partitioned Entropy Decoding are then applied in the reverse order of the compression mechanism to recover the differences. Because error-bounded quantization is lossy, it has no corresponding inverse step in the decompression mechanism. Finally, each inference result is added to its difference to reconstruct and output the original image.
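
The inverse of the residual sketch above could look like the following; the decoding stages are again omitted and the names remain illustrative assumptions.

```python
import numpy as np
import zstandard as zstd

def decompress_residual(payload: bytes, predicted: np.ndarray,
                        error_bound: float = 1.0) -> np.ndarray:
    """Undo the zstd stage, dequantize the residual, and add it back to the prediction."""
    # Reverse the zstd stage to recover the quantized integer residual.
    raw = zstd.ZstdDecompressor().decompress(payload)
    quantized = np.frombuffer(raw, dtype=np.int32).reshape(predicted.shape)

    # Dequantize: each value lies within `error_bound` of the true residual;
    # the quantization itself was lossy and cannot be inverted exactly.
    residual = quantized.astype(np.float32) * (2.0 * error_bound)

    # Add the residual back onto the inference result to reconstruct the frame.
    return np.clip(predicted.astype(np.float32) + residual, 0, 255).astype(np.uint8)
```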

- **Developed by:** Mina
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** Amarjit Singh
- **Model type:** .pt model files
- **Language(s) (NLP):** Libtorch
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** https://github.com/mina98/TEZip-Libtorch-Main.git
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]