Update InternVL2.5
JohnTang93 authored Dec 13, 2024
1 parent 493b31c commit 427a74a
Showing 1 changed file with 19 additions and 5 deletions.
README.md: 24 changes (19 additions & 5 deletions)
@@ -9,7 +9,7 @@ MTVQA: Benchmarking Multilingual Text-Centric Visual Question Answering
|**[🍎 Project Page](https://bytedance.github.io/MTVQA/)** | **[📖 Paper](https://arxiv.org/abs/2405.11985)** |**[📊 Dataset](https://huggingface.co/datasets/ByteDance/MTVQA)** | **[🏆 Leaderboard](https://github.com/bytedance/MTVQA?tab=readme-ov-file#-leaderboard)**

## 🔥 News

* **`2024.12.12`** 🌟 InternVL2.5 has been evaluated on MTVQA: the InternVL2.5 78B model outperforms Qwen2-VL 72B and sets a new SOTA. Congratulations to the [InternVL2.5](https://github.com/OpenGVLab/InternVL?tab=readme-ov-file) team!
* **`2024.09.29`** 🌟 The BlueLM team from VIVO evaluated BlueLM-V-3B on MTVQA. BlueLM-V-3B achieves performance comparable to GPT-4o and ranks third among all SOTA MLLMs!
* **`2024.09.09`** 🌟 We tested GPT-4o mini on MTVQA; it performs exceptionally well among the leading lightweight MLLMs!
* **`2024.09.04`** 🌟 InternVL2 has been evaluated on MTVQA: the InternVL2 76B model outperforms GPT-4V. Thanks to the [InternVL2](https://internvl.readthedocs.io/en/latest/internvl2.0/evaluation.html#mtvqa) team.
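Each news item above reports a score on the MTVQA leaderboard, whose rows list one accuracy per language plus an **Average** column. Assuming the Average is an unweighted mean over the nine MTVQA languages, rounded to one decimal (an assumption; the diff does not define the aggregation), it can be sketched as follows, with purely illustrative language codes and values:

```python
# Hypothetical per-language accuracies (illustrative values, not taken from the table).
scores = {
    "AR": 18.0, "DE": 36.0, "FR": 45.0,
    "IT": 43.0, "JA": 22.0, "KO": 38.0,
    "RU": 16.0, "TH": 18.0, "VI": 42.0,
}

def leaderboard_average(per_lang: dict[str, float]) -> float:
    """Unweighted mean over languages, rounded to one decimal like the Average column."""
    return round(sum(per_lang.values()) / len(per_lang), 1)

print(leaderboard_average(scores))  # → 30.9
```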
@@ -45,9 +45,23 @@ If you want to add your results to the MTVQA leaderboard, feel free to email us
<td><b>VI</b></td>
<td><b>Average</b> </td>
</tr>
<tr>
<th align="left"><font color=darkred>InternVL2.5 78B🥇</font></th>
<td align="middle">✅ </td>
<td>- </td>
<td>- </td>
<td>- </td>
<td>- </td>
<td>- </td>
<td>- </td>
<td>- </td>
<td>- </td>
<td>- </td>
<td><font color=red><b>31.9</b></font></td>
</tr>
<tr>
-<th align="left"><font color=darkred>Qwen2-VL 72B🥇</font></th>
-<td align="middle"></td>
+<th align="left"><font color=darkred>Qwen2-VL 72B🥈</font></th>
+<td align="middle"></td>
<td>20.7 </td>
<td>36.5 </td>
<td>44.1 </td>
@@ -60,7 +74,7 @@ If you want to add your results to the MTVQA leaderboard, feel free to email us
<td><font color=red><b>30.9</b></font></td>
</tr>
<tr>
-<th align="left"><font color=darkred>GPT-4o 🥈 </font></th>
+<th align="left"><font color=darkred>GPT-4o 🥉 </font></th>
<td align="middle">✘</td>
<td>20.2 </td>
<td>34.2 </td>
@@ -74,7 +88,7 @@ If you want to add your results to the MTVQA leaderboard, feel free to email us
<td><font color=red><b>27.8</b></font></td>
</tr>
<tr>
-<th align="left"><font color=darkred>BlueLM-V-3B 🥉 </font></th>
+<th align="left"><font color=darkred>BlueLM-V-3B </font></th>
<td align="middle">✘</td>
<td>17.3 </td>
<td>39.5 </td>