diff --git a/docs/_posts/ahmedlone127/2024-11-26-mini_cpm_2b_8bit_xx.md b/docs/_posts/ahmedlone127/2024-11-26-mini_cpm_2b_8bit_xx.md
new file mode 100644
index 00000000000000..60390336578d3a
--- /dev/null
+++ b/docs/_posts/ahmedlone127/2024-11-26-mini_cpm_2b_8bit_xx.md
@@ -0,0 +1,86 @@
+---
+layout: model
+title: mini_cpm_2b_8bit model from openbmb
+author: John Snow Labs
+name: mini_cpm_2b_8bit
+date: 2024-11-26
+tags: [xx, open_source, openvino]
+task: Text Generation
+language: xx
+edition: Spark NLP 5.5.1
+spark_version: 3.0
+supported: true
+engine: openvino
+annotator: CPMTransformer
+article_header:
+ type: cover
+use_language_switcher: "Python-Scala-Java"
+---
+
+## Description
+
+Pretrained CPMTransformer, adapted from Hugging Face and curated to provide scalability and production-readiness using Spark NLP. `mini_cpm_2b_8bit` is a multilingual model originally trained by openbmb.
+
+{:.btn-box}
+
+
+[Download](https://s3.amazonaws.com/auxdata.johnsnowlabs.com/public/models/mini_cpm_2b_8bit_xx_5.5.1_3.0_1732658809236.zip){:.button.button-orange.button-orange-trans.arr.button-icon}
+[Copy S3 URI](s3://auxdata.johnsnowlabs.com/public/models/mini_cpm_2b_8bit_xx_5.5.1_3.0_1732658809236.zip){:.button.button-orange.button-orange-trans.button-icon.button-copy-s3}
+
+## How to use
+
+
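+Below is a minimal Python sketch of the usual Spark NLP pattern (a `DocumentAssembler` feeding a pretrained annotator). The `CPMTransformer` class name is taken from the annotator listed in the metadata above; the exact import path, the sample sentence, and any generation parameters are assumptions and may need adjusting for your Spark NLP version.
+
+```python
+import sparknlp
+from sparknlp.base import DocumentAssembler
+from sparknlp.annotator import CPMTransformer  # assumed import path for the annotator named in this card
+from pyspark.ml import Pipeline
+
+spark = sparknlp.start()
+
+# Assemble raw text into the "documents" column expected by the model
+document_assembler = DocumentAssembler() \
+    .setInputCol("text") \
+    .setOutputCol("documents")
+
+# Load the pretrained 8-bit MiniCPM model (model name and language taken from this card)
+mini_cpm = CPMTransformer.pretrained("mini_cpm_2b_8bit", "xx") \
+    .setInputCols(["documents"]) \
+    .setOutputCol("generation")
+
+pipeline = Pipeline().setStages([document_assembler, mini_cpm])
+
+data = spark.createDataFrame([["What is the capital of France?"]]).toDF("text")
+result = pipeline.fit(data).transform(data)
+result.select("generation.result").show(truncate=False)
+```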
+
+
+
+{:.model-param}
+## Model Information
+
+{:.table-model}
+|---|---|
+|Model Name:|mini_cpm_2b_8bit|
+|Compatibility:|Spark NLP 5.5.1+|
+|License:|Open Source|
+|Edition:|Official|
+|Input Labels:|[documents]|
+|Output Labels:|[generation]|
+|Language:|xx|
+|Size:|3.0 GB|
+
+## References
+
+https://huggingface.co/openbmb/MiniCPM-2B-dpo-bf16
\ No newline at end of file
diff --git a/docs/_posts/ahmedlone127/2024-11-27-nllb_distilled_600M_8int_xx.md b/docs/_posts/ahmedlone127/2024-11-27-nllb_distilled_600M_8int_xx.md
new file mode 100644
index 00000000000000..fde6f004b253c4
--- /dev/null
+++ b/docs/_posts/ahmedlone127/2024-11-27-nllb_distilled_600M_8int_xx.md
@@ -0,0 +1,86 @@
+---
+layout: model
+title: nllb_distilled_600M_8int model from Facebook
+author: John Snow Labs
+name: nllb_distilled_600M_8int
+date: 2024-11-27
+tags: [xx, open_source, openvino]
+task: Text Generation
+language: xx
+edition: Spark NLP 5.5.1
+spark_version: 3.0
+supported: true
+engine: openvino
+annotator: NLLBTransformer
+article_header:
+ type: cover
+use_language_switcher: "Python-Scala-Java"
+---
+
+## Description
+
+Pretrained NLLBTransformer, adapted from Hugging Face and curated to provide scalability and production-readiness using Spark NLP. `nllb_distilled_600M_8int` is a multilingual model originally trained by facebook.
+
+{:.btn-box}
+
+
+[Download](https://s3.amazonaws.com/auxdata.johnsnowlabs.com/public/models/nllb_distilled_600M_8int_xx_5.5.1_3.0_1732741416718.zip){:.button.button-orange.button-orange-trans.arr.button-icon}
+[Copy S3 URI](s3://auxdata.johnsnowlabs.com/public/models/nllb_distilled_600M_8int_xx_5.5.1_3.0_1732741416718.zip){:.button.button-orange.button-orange-trans.button-icon.button-copy-s3}
+
+## How to use
+
+
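+The following Python sketch uses the same `DocumentAssembler` plus pretrained-annotator pattern. `NLLBTransformer` is the annotator named in the metadata above; the import path, the sample sentence, and any source/target-language options are assumptions to verify against the NLLBTransformer API in your Spark NLP version.
+
+```python
+import sparknlp
+from sparknlp.base import DocumentAssembler
+from sparknlp.annotator import NLLBTransformer  # assumed import path for the annotator named in this card
+from pyspark.ml import Pipeline
+
+spark = sparknlp.start()
+
+# Assemble raw text into the "documents" column expected by the model
+document_assembler = DocumentAssembler() \
+    .setInputCol("text") \
+    .setOutputCol("documents")
+
+# Load the pretrained int8 distilled NLLB model (model name and language taken from this card);
+# source/target language options are omitted here and should be set per the NLLBTransformer API
+nllb = NLLBTransformer.pretrained("nllb_distilled_600M_8int", "xx") \
+    .setInputCols(["documents"]) \
+    .setOutputCol("generation")
+
+pipeline = Pipeline().setStages([document_assembler, nllb])
+
+data = spark.createDataFrame([["Hello, how are you today?"]]).toDF("text")
+result = pipeline.fit(data).transform(data)
+result.select("generation.result").show(truncate=False)
+```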
+
+
+
+{:.model-param}
+## Model Information
+
+{:.table-model}
+|---|---|
+|Model Name:|nllb_distilled_600M_8int|
+|Compatibility:|Spark NLP 5.5.1+|
+|License:|Open Source|
+|Edition:|Official|
+|Input Labels:|[documents]|
+|Output Labels:|[generation]|
+|Language:|xx|
+|Size:|255.0 MB|
+
+## References
+
+https://huggingface.co/facebook/nllb-200-distilled-600M
\ No newline at end of file
diff --git a/docs/_posts/ahmedlone127/2024-11-29-phi_3_mini_128k_instruct_en.md b/docs/_posts/ahmedlone127/2024-11-29-phi_3_mini_128k_instruct_en.md
new file mode 100644
index 00000000000000..fae7ad5900a6be
--- /dev/null
+++ b/docs/_posts/ahmedlone127/2024-11-29-phi_3_mini_128k_instruct_en.md
@@ -0,0 +1,86 @@
+---
+layout: model
+title: phi_3_mini_128k_instruct model from microsoft
+author: John Snow Labs
+name: phi_3_mini_128k_instruct
+date: 2024-11-29
+tags: [en, open_source, openvino]
+task: Text Generation
+language: en
+edition: Spark NLP 5.5.1
+spark_version: 3.0
+supported: true
+engine: openvino
+annotator: Phi3Transformer
+article_header:
+ type: cover
+use_language_switcher: "Python-Scala-Java"
+---
+
+## Description
+
+Pretrained Phi3Transformer, adapted from Hugging Face and curated to provide scalability and production-readiness using Spark NLP. `phi_3_mini_128k_instruct` is an English model originally trained by microsoft.
+
+{:.btn-box}
+
+
+[Download](https://s3.amazonaws.com/auxdata.johnsnowlabs.com/public/models/phi_3_mini_128k_instruct_en_5.5.1_3.0_1732897700551.zip){:.button.button-orange.button-orange-trans.arr.button-icon}
+[Copy S3 URI](s3://auxdata.johnsnowlabs.com/public/models/phi_3_mini_128k_instruct_en_5.5.1_3.0_1732897700551.zip){:.button.button-orange.button-orange-trans.button-icon.button-copy-s3}
+
+## How to use
+
+
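+A minimal Python sketch following the standard Spark NLP pipeline pattern. `Phi3Transformer` is the annotator named in the metadata above; the import path, the sample prompt, and any generation parameters are assumptions and may need adjusting for your Spark NLP version.
+
+```python
+import sparknlp
+from sparknlp.base import DocumentAssembler
+from sparknlp.annotator import Phi3Transformer  # assumed import path for the annotator named in this card
+from pyspark.ml import Pipeline
+
+spark = sparknlp.start()
+
+# Assemble raw text into the "documents" column expected by the model
+document_assembler = DocumentAssembler() \
+    .setInputCol("text") \
+    .setOutputCol("documents")
+
+# Load the pretrained Phi-3 mini 128k instruct model (model name and language taken from this card)
+phi3 = Phi3Transformer.pretrained("phi_3_mini_128k_instruct", "en") \
+    .setInputCols(["documents"]) \
+    .setOutputCol("generation")
+
+pipeline = Pipeline().setStages([document_assembler, phi3])
+
+data = spark.createDataFrame([["Explain what a transformer model is in one sentence."]]).toDF("text")
+result = pipeline.fit(data).transform(data)
+result.select("generation.result").show(truncate=False)
+```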
+
+
+
+{:.model-param}
+## Model Information
+
+{:.table-model}
+|---|---|
+|Model Name:|phi_3_mini_128k_instruct|
+|Compatibility:|Spark NLP 5.5.1+|
+|License:|Open Source|
+|Edition:|Official|
+|Input Labels:|[documents]|
+|Output Labels:|[generation]|
+|Language:|en|
+|Size:|7.0 GB|
+
+## References
+
+https://huggingface.co/microsoft/Phi-3-mini-128k-instruct
\ No newline at end of file
diff --git a/docs/_posts/gadde5300/2024-11-20-bert_embeddings_sec_bert_base_en.md b/docs/_posts/gadde5300/2024-11-20-bert_embeddings_sec_bert_base_en.md
new file mode 100644
index 00000000000000..da7734ed9dcdbd
--- /dev/null
+++ b/docs/_posts/gadde5300/2024-11-20-bert_embeddings_sec_bert_base_en.md
@@ -0,0 +1,105 @@
+---
+layout: model
+title: Financial English BERT Embeddings (Base)
+author: John Snow Labs
+name: bert_embeddings_sec_bert_base
+date: 2024-11-20
+tags: [financial, bert, en, embeddings, open_source, tensorflow]
+task: Embeddings
+language: en
+edition: Spark NLP 5.5.1
+spark_version: 3.0
+supported: true
+engine: tensorflow
+annotator: BertEmbeddings
+article_header:
+ type: cover
+use_language_switcher: "Python-Scala-Java"
+---
+
+## Description
+
+Financial pretrained BERT Embeddings model, uploaded to Hugging Face, adapted and imported into Spark NLP. `sec-bert-base` is an English model originally trained by `nlpaueb`. This is the reference base model, meaning it uses the same architecture as BERT-BASE but is trained on financial documents.
+
+## Predicted Entities
+
+
+
+{:.btn-box}
+
+
+[Download](https://s3.amazonaws.com/auxdata.johnsnowlabs.com/public/models/bert_embeddings_sec_bert_base_en_5.5.1_3.0_1732064992710.zip){:.button.button-orange.button-orange-trans.arr.button-icon}
+[Copy S3 URI](s3://auxdata.johnsnowlabs.com/public/models/bert_embeddings_sec_bert_base_en_5.5.1_3.0_1732064992710.zip){:.button.button-orange.button-orange-trans.button-icon.button-copy-s3}
+
+## How to use
+
+
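+A minimal Python sketch using the standard Spark NLP embeddings pipeline (`DocumentAssembler`, `Tokenizer`, then `BertEmbeddings`); the sample sentence is illustrative only.
+
+```python
+import sparknlp
+from sparknlp.base import DocumentAssembler
+from sparknlp.annotator import Tokenizer, BertEmbeddings
+from pyspark.ml import Pipeline
+
+spark = sparknlp.start()
+
+# Assemble raw text into a document column
+document_assembler = DocumentAssembler() \
+    .setInputCol("text") \
+    .setOutputCol("document")
+
+# Split documents into tokens for the embeddings annotator
+tokenizer = Tokenizer() \
+    .setInputCols(["document"]) \
+    .setOutputCol("token")
+
+# Load the pretrained SEC-BERT base embeddings (model name and language taken from this card)
+embeddings = BertEmbeddings.pretrained("bert_embeddings_sec_bert_base", "en") \
+    .setInputCols(["document", "token"]) \
+    .setOutputCol("embeddings")
+
+pipeline = Pipeline().setStages([document_assembler, tokenizer, embeddings])
+
+data = spark.createDataFrame([["Total revenue increased 12% compared to the prior fiscal year."]]).toDF("text")
+result = pipeline.fit(data).transform(data)
+result.select("embeddings.embeddings").show(truncate=False)
+```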
+
+