developy committed
Commit 76a9250 · verified · 1 Parent(s): 1cc34ec

Update README.md

Files changed (1):
  1. README.md +10 -42
README.md CHANGED
@@ -1,31 +1,12 @@
- ---
- license: apache-2.0
- language:
- - en
- metrics:
- - accuracy
- base_model:
- - prs-eth/marigold-depth-v1-0
- tags:
- - code
- ---
  # ApDepth: Aiming for Precise Monocular Depth Estimation Based on Diffusion Models

  This repository is based on [Marigold](https://marigoldmonodepth.github.io), CVPR 2024 Best Paper: [**Repurposing Diffusion-Based Image Generators for Monocular Depth Estimation**](https://arxiv.org/abs/2312.02145)

- <!-- [![Website](doc/badges/badge-website.svg)](https://marigoldmonodepth.github.io)
- [![Paper](https://img.shields.io/badge/arXiv-PDF-b31b1b)](https://arxiv.org/abs/2312.02145)
- [![Hugging Face (LCM) Space](https://img.shields.io/badge/🤗%20Hugging%20Face%20(LCM)-Space-yellow)](https://huggingface.co/spaces/prs-eth/marigold-lcm)
- [![Hugging Face (LCM) Model](https://img.shields.io/badge/🤗%20Hugging%20Face%20(LCM)-Model-green)](https://huggingface.co/prs-eth/marigold-lcm-v1-0)
- [![Open In Colab](doc/badges/badge-colab.svg)](https://colab.research.google.com/drive/12G8reD13DdpMie5ZQlaFNo2WCGeNUH-u?usp=sharing) -->
  [![Website](doc/badges/badge-website.svg)](https://haruko386.github.io/research)
  [![License](https://img.shields.io/badge/License-Apache--2.0-929292)](https://www.apache.org/licenses/LICENSE-2.0)
  [![Static Badge](https://img.shields.io/badge/build-Haruko386-brightgreen?style=flat&logo=steam&logoColor=white&logoSize=auto&label=steam&labelColor=black&color=gray&cacheSeconds=3600)](https://steamcommunity.com/profiles/76561198217881431/)
- <!-- [![Hugging Face Model](https://img.shields.io/badge/🤗%20Hugging%20Face-Model-green)](https://huggingface.co/prs-eth/marigold-v1-0) -->
- <!-- [![Website](https://img.shields.io/badge/Project-Website-1081c2)](https://arxiv.org/abs/2312.02145) -->
- <!-- [![GitHub](https://img.shields.io/github/stars/prs-eth/Marigold?style=default&label=GitHub%20★&logo=github)](https://github.com/prs-eth/Marigold) -->
- <!-- [![HF Space](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Space-blue)]() -->
- <!-- [![Docker](doc/badges/badge-docker.svg)]() -->
+ [![Hugging Face Model](https://img.shields.io/badge/🤗%20Hugging%20Face-Model-green)](https://huggingface.co/developy/ApDepth)
+ [![Hugging Face Demo](https://img.shields.io/badge/🤗%20Hugging%20Face-Demo-purple)](https://huggingface.co/spaces/developy/ApDepth)

  [Haruko386](https://haruko386.github.io/),
  [Shuai Yuan](https://syjz.teacher.360eol.com/teacherBasic/preview?teacherId=23776)
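
The two new badges point to the developy/ApDepth model and Space on the Hub. For readers who want the model files locally, here is a hedged sketch using the `huggingface_hub` CLI; the repo id comes from the badge URL above, while the target directory is a made-up example and the repo's exact contents are not confirmed by this diff:

```bash
# Sketch: download the model repo behind the new badge with the
# huggingface_hub CLI. The repo id comes from the badge URL above;
# the --local-dir path is a made-up example.
pip install -U "huggingface_hub[cli]"
huggingface-cli download developy/ApDepth --local-dir ./ApDepth-checkpoint
```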
@@ -35,7 +16,8 @@ This repository is based on [Marigold](https://marigoldmonodepth.github.io), CVP
  >We present **ApDepth**, a diffusion model and an associated fine-tuning protocol for monocular depth estimation, built on Marigold. Its core innovation lies in addressing the weak feature representation capability of diffusion models. Following Marigold, our model is derived from Stable Diffusion and fine-tuned on synthetic data (Hypersim and Virtual KITTI), and it achieves ideal results in object edge refinement.

  ## 📢 News
- - 2025-09-23: We change Marigold from `Stochastic multi-step generation` to `Deterministic one-step perception`
+ - 2025-10-09: We propose a novel diffusion-based depth estimation framework guided by pre-trained models.
+ - 2025-09-23: We changed Marigold from **Stochastic multi-step generation** to **Deterministic one-step perception**.
  - 2025-08-10: Working on optimizations in feature expression.
  - 2025-05-08: Cloned Marigold locally.
@@ -43,7 +25,7 @@ This repository is based on [Marigold](https://marigoldmonodepth.github.io), CVP

  **We offer several ways to interact with Marigold**:

- 1. A free online interactive demo is available here: <a href="https://huggingface.co/spaces/prs-eth/marigold-lcm"><img src="https://img.shields.io/badge/🤗%20Hugging%20Face%20(LCM)-Space-yellow" height="16"></a> (kudos to the HF team for the GPU grant)
+ 1. A free online interactive demo is available here: <a href="https://huggingface.co/spaces/developy/ApDepth"><img src="https://img.shields.io/badge/🤗%20Hugging%20Face-Demo-purple" height="18"></a>

  2. If you just want to see the examples, visit our gallery: <a href="https://haruko386.github.io/research"><img src="doc/badges/badge-website.svg" height="16"></a>
@@ -77,8 +59,8 @@ cd ApDepth
  **Using Conda:**
  Alternatively, create a Conda environment and install the dependencies into it:

- conda create -n marigold python==3.12.9
- conda activate marigold
+ conda create -n apdepth python==3.12.9
+ conda activate apdepth
  pip install -r requirements.txt

  Keep the environment activated before running the inference script.
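
Taken together with the `cd ApDepth` context in the hunk header, the updated environment steps amount to the short sequence below. This is a sketch that assumes `requirements.txt` sits at the repository root, which the surrounding context implies but does not show:

```bash
# Create and activate the renamed Conda environment, then install the
# pinned dependencies. Assumes requirements.txt is at the repo root.
cd ApDepth
conda create -n apdepth python==3.12.9 -y
conda activate apdepth
pip install -r requirements.txt
```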
@@ -88,25 +70,11 @@ Activate the environment again after restarting the terminal session.

  ### 📷 Prepare images

- 1. Use selected images from our paper:
-
- ```bash
- bash script/download_sample_data.sh
- ```
-
- 1. Or place your images in a directory, for example, under `input/in-the-wild_example`, and run the following inference command.
-
- ### 🚀 Run inference with LCM (faster)
-
- The [LCM checkpoint](https://huggingface.co/prs-eth/marigold-lcm-v1-0) is distilled from our original checkpoint towards faster inference speed (by reducing inference steps). The inference steps can be as few as 1 (default) to 4. Run with default LCM setting:
-
- ```bash
- python run.py \
- --input_rgb_dir input/in-the-wild_example \
- --output_dir output/in-the-wild_example_lcm
- ```
-
- ### 🎮 Run inference with DDIM (paper setting)
+ 1. Use the selected images under `input`.
+
+ 1. Or place your own images in a directory, for example under `input/test-image`, and run the following inference command.
+
+ ### 🎮 Run inference with the paper setting

  This setting corresponds to our paper. For academic comparison, please run with this setting.
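
The paper-setting command itself falls outside this hunk. As a reference point, here is a minimal invocation assembled only from the two `run.py` flags visible in the removed LCM block, pointed at the new `input/test-image` directory; any extra options the paper setting requires (checkpoint path, denoising steps, ensemble size) are not visible in this diff, so they are omitted:

```bash
# Minimal inference sketch using only the run.py flags shown elsewhere in
# this diff (--input_rgb_dir, --output_dir); the paper setting may need
# additional options that this diff does not show.
python run.py \
    --input_rgb_dir input/test-image \
    --output_dir output/test-image
```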