Update README.md
README.md CHANGED

@@ -4,6 +4,11 @@ license: mit
 datasets:
 - liuhaotian/LLaVA-Instruct-150K
 - lmms-lab/COCO-Caption
+base_model:
+- meta-llama/Llama-3.2-1B
+- openai/clip-vit-large-patch14
+tags:
+- code
 ---
 # BobVLM β¨π
 [[Article on Medium](https://medium.com/p/7d51099bfbcb/edit)] [[Package on Github](https://github.com/logic-OT/BobVLM)]
@@ -24,7 +29,7 @@ To maintain efficiency and accessibility:
 - Can be trained on accessible hardware (T4 or P100 GPUs)
 
 ## Demo
-Check out the awesome demo here ππ: [
+Check out the awesome demo here ππ: [Demo on huggingface](https://huggingface.co/selfDotOsman/BobVLM-1.5b)
 
 ## Installation
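For reference, the model-card frontmatter after this commit would read roughly as follows. This is a sketch reconstructed from the hunks in this diff (the `license: mit` line is taken from the first hunk header; surrounding lines outside the hunks are not shown here):

```yaml
license: mit
datasets:
- liuhaotian/LLaVA-Instruct-150K
- lmms-lab/COCO-Caption
base_model:
- meta-llama/Llama-3.2-1B
- openai/clip-vit-large-patch14
tags:
- code
```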