
Commit 9db1b1b

docs: Add Sponsors

1 parent fc7e354

File tree

1 file changed: +19 −5 lines

1 file changed

+19
-5
lines changed

README.md (+19 −5)
@@ -11,8 +11,6 @@
   <img src="media/cover.png" alt="Your image description">
 </p>
 
-</br>
-
 ## Why is this course different?
 
 *By finishing the **"LLM Twin: Building Your Production-Ready AI Replica"** free course, you will learn how to design, train, and deploy a production-ready LLM twin of yourself powered by LLMs, vector DBs, and LLMOps good practices.*
@@ -67,22 +65,26 @@ You will also **learn** to **leverage MLOps best practices**, such as experiment
 ### The feature pipeline
 
 - Consume messages from a queue through a [Bytewax](https://github.com/bytewax/bytewax?utm_source=github&utm_medium=decodingml&utm_campaign=2024_q1) streaming pipeline.
-- Every message will be cleaned, chunked, embedded (using [Superlinked](https://github.com/superlinked/superlinked-alpha?utm_source=community&utm_medium=github&utm_campaign=oscourse), and loaded into a [Qdrant](https://qdrant.tech/?utm_source=decodingml&utm_medium=referral&utm_campaign=llm-course) vector DB in real-time.
+- Every message will be cleaned, chunked, embedded and loaded into a [Qdrant](https://qdrant.tech/?utm_source=decodingml&utm_medium=referral&utm_campaign=llm-course) vector DB in real-time.
+In the bonus series, we refactor the cleaning, chunking, and embedding logic using [Superlinked](https://rebrand.ly/superlinked-github), a specialized vector compute engine. We will also load and index the vectors to [Redis vector search](https://redis.io/solutions/vector-search/).
 - ☁️ Deployed on [AWS](https://aws.amazon.com/).
 
 ### The training pipeline
+
 - Create a custom dataset based on your digital data.
 - Fine-tune an LLM using QLoRA.
 - Use [Comet ML's](https://www.comet.com/signup/?utm_source=decoding_ml&utm_medium=partner&utm_content=github) experiment tracker to monitor the experiments.
 - Evaluate and save the best model to [Comet's](https://www.comet.com/signup/?utm_source=decoding_ml&utm_medium=partner&utm_content=github) model registry.
 - ☁️ Deployed on [Qwak](https://www.qwak.com/lp/end-to-end-mlops/?utm_source=github&utm_medium=referral&utm_campaign=decodingml).
 
 ### The inference pipeline
-- Load and quantize the fine-tuned LLM from [Comet's](https://www.comet.com/signup/?utm_source=decoding_ml&utm_medium=partner&utm_content=github) model registry.
+
+- Load the fine-tuned LLM from [Comet's](https://www.comet.com/signup/?utm_source=decoding_ml&utm_medium=partner&utm_content=github) model registry.
 - Deploy it as a REST API.
-- Enhance the prompts using RAG.
+- Enhance the prompts using advanced RAG.
 - Generate content using your LLM twin.
 - Monitor the LLM using [Comet's](https://www.comet.com/signup/?framework=llm&utm_source=decoding_ml&utm_medium=partner&utm_content=github) prompt monitoring dashboard.
+- In the bonus series, we refactor the advanced RAG layer to write more optimal queries using [Superlinked](https://rebrand.ly/superlinked-github).
 - ☁️ Deployed on [Qwak](https://www.qwak.com/lp/end-to-end-mlops/?utm_source=github&utm_medium=referral&utm_campaign=decodingml).
 
 </br>
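The feature pipeline described in the hunk above (clean, chunk, embed, load) can be sketched in plain Python. This is a minimal illustration of the dataflow, not the course's actual Bytewax/Qdrant code: `clean`, `chunk`, the hash-based `embed`, and the in-memory `vector_db` dict are stand-ins for the real streaming operators, embedding model, and vector DB.

```python
import hashlib
import re

def clean(text: str) -> str:
    """Normalize whitespace and strip stray line breaks."""
    return re.sub(r"\s+", " ", text).strip()

def chunk(text: str, size: int = 50, overlap: int = 10) -> list[str]:
    """Split text into fixed-size character chunks with overlap."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def embed(chunk_text: str, dim: int = 8) -> list[float]:
    """Toy deterministic embedding (hash-based); a real pipeline
    would call an embedding model here."""
    digest = hashlib.sha256(chunk_text.encode()).digest()
    return [b / 255.0 for b in digest[:dim]]

vector_db: dict[str, dict] = {}  # stand-in for the Qdrant collection

def load(message: str) -> int:
    """Run one message through the pipeline and upsert its chunks."""
    cleaned = clean(message)
    doc_id = hashlib.md5(cleaned.encode()).hexdigest()
    for i, c in enumerate(chunk(cleaned)):
        vector_db[f"{doc_id}-{i}"] = {"vector": embed(c), "payload": {"text": c}}
    return len(vector_db)

n = load("An   LLM twin is an AI character\nthat mimics your writing style.")
```

In the real course, each step runs as an operator inside the Bytewax dataflow and the upsert targets Qdrant over the network; only the shape of the flow is preserved here.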
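The training pipeline fine-tunes with QLoRA, which freezes the (quantized) base weight matrix W and trains two small low-rank matrices A (r×k) and B (d×r), so the effective weight becomes W + (alpha/r)·B·A. A stdlib-only sketch of that merge step, with illustrative matrices and hyperparameters rather than the course's configuration:

```python
def matmul(X, Y):
    """Multiply two matrices given as nested lists."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

def lora_merge(W, A, B, alpha):
    """Return W + (alpha / r) * B @ A, the merged LoRA weight."""
    r = len(A)  # rank = number of rows in A
    delta = matmul(B, A)
    scale = alpha / r
    return [[w + scale * d for w, d in zip(wrow, drow)]
            for wrow, drow in zip(W, delta)]

# d = k = 2, rank r = 1: the adapter adds a rank-1 update to the frozen weight.
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen (quantized) base weight
A = [[0.5, 0.5]]               # r x k, trainable
B = [[1.0], [2.0]]             # d x r, trainable
merged = lora_merge(W, A, B, alpha=2.0)
```

The point of the low-rank factorization is that only d·r + r·k adapter parameters are trained instead of d·k, which is what makes fine-tuning a large model feasible on modest hardware.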
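"Enhance the prompts using RAG" in the inference pipeline boils down to: embed the query, retrieve the nearest stored chunks, and prepend them as context before calling the LLM. A minimal stdlib sketch with a toy in-memory store and cosine similarity; the store contents, vectors, and prompt template are illustrative, and a real deployment would query Qdrant instead:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy store: chunk text -> precomputed embedding.
store = {
    "The LLM twin mimics your writing style.": [1.0, 0.2, 0.0],
    "Deploy the model as a REST API.":         [0.0, 1.0, 0.3],
    "Fine-tune with QLoRA on your data.":      [0.1, 0.0, 1.0],
}

def retrieve(query_vec, k=2):
    """Return the top-k chunks ranked by cosine similarity."""
    ranked = sorted(store, key=lambda c: cosine(store[c], query_vec), reverse=True)
    return ranked[:k]

def build_prompt(question, query_vec):
    """Prepend retrieved context to the user question."""
    context = "\n".join(f"- {c}" for c in retrieve(query_vec))
    return f"Context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("How is the model served?", [0.0, 1.0, 0.2])
```

The "advanced RAG" of the course layers query expansion, reranking, and better query construction on top of this basic retrieve-then-prompt loop.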
@@ -225,3 +227,15 @@ A big "Thank you 🙏" to all our contributors! This course is possible only bec
 <img src="https://contrib.rocks/image?repo=decodingml/llm-twin-course" />
 </a>
 </p>
+
+## Sponsors
+
+<table>
+<tr>
+<td align="center"><img src="media/sponsors/comet.png" width="150" alt="Image 1"></td>
+<td align="center"><img src="media/sponsors/bytewax.png" width="150" alt="Image 2"></td>
+<td align="center"><img src="media/sponsors/qdrant.svg" width="150" alt="Image 3"></td>
+<td align="center"><img src="media/sponsors/qwak.png" width="150" alt="Image 4"></td>
+<td align="center"><img src="media/sponsors/superlinked.png" width="150" alt="Image 5"></td>
+</tr>
+</table>
