mirror of https://github.com/THU-MIG/yolov10.git
synced 2025-05-23 13:34:23 +08:00

Add docs Ultralytics 文档: zh/index.md (#5871)

Signed-off-by: Glenn Jocher <glenn.jocher@ultralytics.com>
Co-authored-by: Laughing-q <1185102784@qq.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>

parent b9b0fd8bf4
commit 0f9f857449
.github/workflows/links.yml (vendored, 4 lines changed)

@@ -28,7 +28,7 @@ jobs:
           timeout_minutes: 5
           retry_wait_seconds: 60
           max_attempts: 3
-          command: lychee --accept 429,999 --exclude-loopback --exclude 'https?://(www\.)?(linkedin\.com|twitter\.com|instagram\.com|kaggle\.com)' --exclude-path '**/ci.yaml' --exclude-mail --github-token ${{ secrets.GITHUB_TOKEN }} './**/*.md' './**/*.html'
+          command: lychee --accept 429,999 --exclude-loopback --exclude 'https?://(www\.)?(linkedin\.com|twitter\.com|instagram\.com|kaggle\.com|fonts\.gstatic\.com|url\.com)' --exclude-path '**/ci.yaml' --exclude-mail --github-token ${{ secrets.GITHUB_TOKEN }} './**/*.md' './**/*.html'

       - name: Test Markdown, HTML, YAML, Python and Notebook links with retry
         if: github.event_name == 'workflow_dispatch'
@@ -37,4 +37,4 @@ jobs:
           timeout_minutes: 5
           retry_wait_seconds: 60
           max_attempts: 3
-          command: lychee --accept 429,999 --exclude-loopback --exclude 'https?://(www\.)?(linkedin\.com|twitter\.com|instagram\.com|kaggle\.com|url\.com)' --exclude-path '**/ci.yaml' --exclude-mail --github-token ${{ secrets.GITHUB_TOKEN }} './**/*.md' './**/*.html' './**/*.yml' './**/*.yaml' './**/*.py' './**/*.ipynb'
+          command: lychee --accept 429,999 --exclude-loopback --exclude 'https?://(www\.)?(linkedin\.com|twitter\.com|instagram\.com|kaggle\.com|url\.com|fonts\.gstatic\.com|url\.com)' --exclude-path '**/ci.yaml' --exclude-mail --github-token ${{ secrets.GITHUB_TOKEN }} './**/*.md' './**/*.html' './**/*.yml' './**/*.yaml' './**/*.py' './**/*.ipynb'
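The only substantive change in both hunks above is the lychee `--exclude` pattern. A quick way to sanity-check what the new pattern excludes locally (using Python's `re` module as a stand-in for lychee's matcher; the example URLs are illustrative) is:

```python
import re

# Exclusion regex from the updated command in the second hunk (copied verbatim,
# including the redundant second `url\.com` alternative that the commit introduces).
pattern = re.compile(
    r'https?://(www\.)?(linkedin\.com|twitter\.com|instagram\.com|kaggle\.com|url\.com|fonts\.gstatic\.com|url\.com)'
)

excluded = bool(pattern.search('https://www.linkedin.com/company/ultralytics'))  # matched -> skipped by lychee
checked = not pattern.search('https://docs.ultralytics.com')                     # unmatched -> still link-checked
```

Note that the second hunk's pattern ends up listing `url\.com` twice; the duplicate is harmless but redundant.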
@@ -1,33 +1,32 @@
----
-description: Learn how to install Ultralytics in developer mode, build and serve it locally for testing, and deploy your documentation site on platforms like GitHub Pages, GitLab Pages, and Amazon S3.
-keywords: Ultralytics, documentation, mkdocs, installation, developer mode, building, deployment, local server, GitHub Pages, GitLab Pages, Amazon S3
----

 # Ultralytics Docs

 Ultralytics Docs are deployed to [https://docs.ultralytics.com](https://docs.ultralytics.com).

+[](https://github.com/ultralytics/docs/actions/workflows/pages/pages-build-deployment) [](https://github.com/ultralytics/docs/actions/workflows/links.yml)

 ### Install Ultralytics package

+[](https://badge.fury.io/py/ultralytics) [](https://pepy.tech/project/ultralytics)

 To install the ultralytics package in developer mode, you will need to have Git and Python 3 installed on your system. Then, follow these steps:

 1. Clone the ultralytics repository to your local machine using Git:

    ```bash
    git clone https://github.com/ultralytics/ultralytics.git
    ```

 2. Navigate to the root directory of the repository:

    ```bash
    cd ultralytics
    ```

 3. Install the package in developer mode using pip:

    ```bash
-   pip install -e ".[dev]"
+   pip install -e '.[dev]'
    ```

 This will install the ultralytics package and its dependencies in developer mode, allowing you to make changes to the package code and have them reflected immediately in your Python environment.
@@ -22,13 +22,13 @@ Here's a brief description of our CI actions:

 Below is the table showing the status of these CI tests for our main repositories:

 | Repository | CI | Docker Deployment | Broken Links | CodeQL | PyPi and Docs Publishing |
 |------------|----|-------------------|--------------|--------|--------------------------|
 | [yolov3](https://github.com/ultralytics/yolov3) | [](https://github.com/ultralytics/yolov3/actions/workflows/ci-testing.yml) | [](https://github.com/ultralytics/yolov3/actions/workflows/docker.yml) | [](https://github.com/ultralytics/yolov3/actions/workflows/links.yml) | [](https://github.com/ultralytics/yolov3/actions/workflows/codeql-analysis.yml) | |
 | [yolov5](https://github.com/ultralytics/yolov5) | [](https://github.com/ultralytics/yolov5/actions/workflows/ci-testing.yml) | [](https://github.com/ultralytics/yolov5/actions/workflows/docker.yml) | [](https://github.com/ultralytics/yolov5/actions/workflows/links.yml) | [](https://github.com/ultralytics/yolov5/actions/workflows/codeql-analysis.yml) | |
 | [ultralytics](https://github.com/ultralytics/ultralytics) | [](https://github.com/ultralytics/ultralytics/actions/workflows/ci.yaml) | [](https://github.com/ultralytics/ultralytics/actions/workflows/docker.yaml) | [](https://github.com/ultralytics/ultralytics/actions/workflows/links.yml) | [](https://github.com/ultralytics/ultralytics/actions/workflows/codeql.yaml) | [](https://github.com/ultralytics/ultralytics/actions/workflows/publish.yml) |
 | [hub](https://github.com/ultralytics/hub) | [](https://github.com/ultralytics/hub/actions/workflows/ci.yaml) | | [](https://github.com/ultralytics/hub/actions/workflows/links.yml) | | |
-| [docs](https://github.com/ultralytics/docs) | | | | | [](https://github.com/ultralytics/docs/actions/workflows/pages/pages-build-deployment) |
+| [docs](https://github.com/ultralytics/docs) | | | | | [](https://github.com/ultralytics/docs/actions/workflows/pages/pages-build-deployment) [](https://github.com/ultralytics/docs/actions/workflows/links.yml) |

 Each badge shows the status of the last run of the corresponding CI test on the `main` branch of the respective repository. If a test fails, the badge will display a "failing" status, and if it passes, it will display a "passing" status.
@@ -35,13 +35,13 @@ Different delegates are available on Android devices to accelerate model inference

 Here's a table showing the primary vendors, their product lines, popular devices, and supported delegates:

 | Vendor | Product Lines | Popular Devices | Delegates Supported |
 |--------|---------------|-----------------|---------------------|
 | [Qualcomm](https://www.qualcomm.com/) | [Snapdragon (e.g., 800 series)](https://www.qualcomm.com/snapdragon) | [Samsung Galaxy S21](https://www.samsung.com/global/galaxy/galaxy-s21-5g/), [OnePlus 9](https://www.oneplus.com/9), [Google Pixel 6](https://store.google.com/product/pixel_6) | CPU, GPU, Hexagon, NNAPI |
 | [Samsung](https://www.samsung.com/) | [Exynos (e.g., Exynos 2100)](https://www.samsung.com/semiconductor/minisite/exynos/) | [Samsung Galaxy S21 (Global version)](https://www.samsung.com/global/galaxy/galaxy-s21-5g/) | CPU, GPU, NNAPI |
 | [MediaTek](https://www.mediatek.com/) | [Dimensity (e.g., Dimensity 1200)](https://www.mediatek.com/products/smartphones) | [Realme GT](https://www.realme.com/global/realme-gt), [Xiaomi Redmi Note](https://www.mi.com/en/phone/redmi/note-list) | CPU, GPU, NNAPI |
 | [HiSilicon](https://www.hisilicon.com/) | [Kirin (e.g., Kirin 990)](https://www.hisilicon.com/en/products/Kirin) | [Huawei P40 Pro](https://consumer.huawei.com/en/phones/p40-pro/), [Huawei Mate 30 Pro](https://consumer.huawei.com/en/phones/mate30-pro/) | CPU, GPU, NNAPI |
-| [NVIDIA](https://www.nvidia.com/) | [Tegra (e.g., Tegra X1)](https://www.nvidia.com/en-us/autonomous-machines/embedded-systems-dev-kits-modules/) | [NVIDIA Shield TV](https://www.nvidia.com/en-us/shield/shield-tv/), [Nintendo Switch](https://www.nintendo.com/switch/) | CPU, GPU, NNAPI |
+| [NVIDIA](https://www.nvidia.com/) | [Tegra (e.g., Tegra X1)](https://developer.nvidia.com/content/tegra-x1) | [NVIDIA Shield TV](https://www.nvidia.com/en-us/shield/shield-tv/), [Nintendo Switch](https://www.nintendo.com/switch/) | CPU, GPU, NNAPI |

 Please note that the list of devices mentioned is not exhaustive and may vary depending on the specific chipsets and device models. Always test your models on your target devices to ensure compatibility and optimal performance.
@@ -37,3 +37,8 @@ div.highlight {
   max-height: 20rem;
   overflow-y: auto; /* for adding a scrollbar when needed */
 }
+
+/* Set content width */
+.md-grid {
+  max-width: 1440px;
+}
docs/zh/index.md (new file, 69 lines)

@@ -0,0 +1,69 @@
+---
+comments: true
+description: Explore the complete guide to Ultralytics YOLOv8, a high-speed, high-accuracy object detection and image segmentation model. Includes installation, prediction, and training tutorials, and more.
+keywords: Ultralytics, YOLOv8, object detection, image segmentation, machine learning, deep learning, computer vision, YOLOv8 installation, YOLOv8 prediction, YOLOv8 training, YOLO history, YOLO licenses
+---
+
+# Ultralytics Chinese Documentation (Ultralytics 中文文档)
+
+<div align="center">
+  <p>
+    <a href="https://yolovision.ultralytics.com" target="_blank">
+    <img width="1024" src="https://raw.githubusercontent.com/ultralytics/assets/main/yolov8/banner-yolov8.png"></a>
+  </p>
+  <a href="https://github.com/ultralytics/ultralytics/actions/workflows/ci.yaml"><img src="https://github.com/ultralytics/ultralytics/actions/workflows/ci.yaml/badge.svg" alt="Ultralytics CI"></a>
+  <a href="https://codecov.io/github/ultralytics/ultralytics"><img src="https://codecov.io/github/ultralytics/ultralytics/branch/main/graph/badge.svg?token=HHW7IIVFVY" alt="Ultralytics Code Coverage"></a>
+  <a href="https://zenodo.org/badge/latestdoi/264818686"><img src="https://zenodo.org/badge/264818686.svg" alt="YOLOv8 Citation"></a>
+  <a href="https://hub.docker.com/r/ultralytics/ultralytics"><img src="https://img.shields.io/docker/pulls/ultralytics/ultralytics?logo=docker" alt="Docker Pulls"></a>
+  <br>
+  <a href="https://console.paperspace.com/github/ultralytics/ultralytics"><img src="https://assets.paperspace.io/img/gradient-badge.svg" alt="Run on Gradient"/></a>
+  <a href="https://colab.research.google.com/github/ultralytics/ultralytics/blob/main/examples/tutorial.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a>
+  <a href="https://www.kaggle.com/ultralytics/yolov8"><img src="https://kaggle.com/static/images/open-in-kaggle.svg" alt="Open In Kaggle"></a>
+</div>
+
+Introducing [Ultralytics](https://ultralytics.com) [YOLOv8](https://github.com/ultralytics/ultralytics), the latest version of the acclaimed real-time object detection and image segmentation model. YOLOv8 builds on cutting-edge advances in deep learning and computer vision, offering unparalleled performance in terms of speed and accuracy. Its streamlined design makes it suitable for a wide range of applications and easily adaptable to different hardware platforms, from edge devices to cloud APIs.
+
+Explore the YOLOv8 docs, a comprehensive resource designed to help you understand and make the most of its features and capabilities. Whether you are a seasoned machine learning practitioner or new to the field, this hub aims to maximize the potential of YOLOv8 in your projects.
+
+## Where to Start
+
+- **Install** `ultralytics` with pip and get up and running in minutes [:material-clock-fast: Get Started](https://docs.ultralytics.com/quickstart/){ .md-button }
+- **Predict** on new images and videos with YOLOv8 [:octicons-image-16: Predict on Images](https://docs.ultralytics.com/predict/){ .md-button }
+- **Train** a new YOLOv8 model on your own custom dataset [:fontawesome-solid-brain: Train a Model](https://docs.ultralytics.com/train/){ .md-button }
+- **Explore** YOLOv8 tasks such as segmentation, classification, pose, and tracking [:material-magnify-expand: Explore Tasks](https://docs.ultralytics.com/tasks/){ .md-button }
+
+<p align="center">
+  <br>
+  <iframe width="720" height="405" src="https://www.youtube.com/embed/LNwODJXcvt4?si=7n1UvGRLSd9p5wKs"
+    title="YouTube video player" frameborder="0"
+    allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share"
+    allowfullscreen>
+  </iframe>
+  <br>
+  <strong>Watch:</strong> How to train a YOLOv8 model on your custom dataset in Google Colab.
+</p>
+
+## YOLO: A Brief History
+
+[YOLO](https://arxiv.org/abs/1506.02640) (You Only Look Once), a popular object detection and image segmentation model developed by Joseph Redmon and Ali Farhadi at the University of Washington, was launched in 2015 and quickly gained popularity for its high speed and accuracy.
+
+- [YOLOv2](https://arxiv.org/abs/1612.08242), released in 2016, improved the original model by incorporating batch normalization, anchor boxes, and dimension clusters.
+- [YOLOv3](https://pjreddie.com/media/files/papers/YOLOv3.pdf), launched in 2018, further enhanced the model's performance using a more efficient backbone network, multiple anchors, and spatial pyramid pooling.
+- [YOLOv4](https://arxiv.org/abs/2004.10934), released in 2020, introduced innovations such as Mosaic data augmentation, a new anchor-free detection head, and a new loss function.
+- [YOLOv5](https://github.com/ultralytics/yolov5) further improved the model's performance and added new features such as hyperparameter optimization, integrated experiment tracking, and automatic export to popular export formats.
+- [YOLOv6](https://github.com/meituan/YOLOv6), open-sourced by [Meituan](https://about.meituan.com/) in 2022, is in use in many of the company's autonomous delivery robots.
+- [YOLOv7](https://github.com/WongKinYiu/yolov7) added additional tasks such as pose estimation on the COCO keypoints dataset.
+- [YOLOv8](https://github.com/ultralytics/ultralytics) is the latest version of YOLO by Ultralytics. As a cutting-edge, state-of-the-art (SOTA) model, YOLOv8 builds on the success of previous versions, introducing new features and improvements for enhanced performance, flexibility, and efficiency. YOLOv8 supports a full range of vision AI tasks, including [detection](https://docs.ultralytics.com/tasks/detect/), [segmentation](https://docs.ultralytics.com/tasks/segment/), [pose estimation](https://docs.ultralytics.com/tasks/pose/), [tracking](https://docs.ultralytics.com/modes/track/), and [classification](https://docs.ultralytics.com/tasks/classify/). This versatility allows users to leverage YOLOv8's capabilities across diverse applications and domains.
+
+## YOLO Licenses: How is Ultralytics YOLO licensed?
+
+Ultralytics offers two licensing options to accommodate diverse use cases:
+
+- **AGPL-3.0 License**: This [OSI-approved](https://opensource.org/licenses/) open-source license is ideal for students and enthusiasts, promoting open collaboration and knowledge sharing. See the [LICENSE](https://github.com/ultralytics/ultralytics/blob/main/LICENSE) file for more details.
+- **Enterprise License**: Designed for commercial use, this license permits seamless integration of Ultralytics software and AI models into commercial goods and services, bypassing the open-source requirements of AGPL-3.0. If your scenario involves embedding our solutions into a commercial offering, reach out through [Ultralytics Licensing](https://ultralytics.com/license).
+
+Our licensing strategy is designed to ensure that any improvements to our open-source projects are returned to the community. We hold the principles of open source close to our hearts ❤️, and our mission is to guarantee that our contributions can be used and expanded upon in ways that are beneficial to all.
+
+---
+
+**Note**: We are working to provide Chinese-language versions of our documentation pages and hope to publish them in the coming months. Please stay tuned for updates, and thank you for your patience.
mkdocs.yml (34 lines changed)

@@ -11,6 +11,7 @@ remote_name: https://github.com/ultralytics/docs

 theme:
   name: material
+  language: en
   custom_dir: docs/overrides
   logo: https://github.com/ultralytics/assets/raw/main/logo/Ultralytics_Logotype_Reverse.svg
   favicon: assets/favicon.ico
@@ -25,14 +26,14 @@ theme:
     - scheme: default
       # primary: grey
       toggle:
-        icon: material/brightness-7
+        icon: material/toggle-switch
         name: Switch to dark mode

     # Palette toggle for dark mode
     - scheme: slate
       # primary: black
       toggle:
-        icon: material/brightness-4
+        icon: material/toggle-switch-off-outline
         name: Switch to light mode
   features:
     - announce.dismiss
@@ -44,15 +45,17 @@ theme:
     - search.share
     - search.suggest
     - toc.follow
-    - toc.integrate
+    # - toc.integrate
     - navigation.top
     - navigation.tabs
     - navigation.tabs.sticky
-    - navigation.expand
+    - navigation.prune
     - navigation.footer
     - navigation.tracking
     - navigation.instant
+    - navigation.instant.progress
     - navigation.indexes
+    - navigation.sections
     - content.tabs.link # all code tabs change simultaneously

 # Customization
@@ -64,20 +67,13 @@ extra:
   analytics:
     provider: google
     property: G-2M5EHKC0BH
-  # feedback:
-  #   title: Was this page helpful?
-  #   ratings:
-  #     - icon: material/heart
-  #       name: This page was helpful
-  #       data: 1
-  #       note: Thanks for your feedback!
-  #     - icon: material/heart-broken
-  #       name: This page could be improved
-  #       data: 0
-  #       note: >-
-  #         Thanks for your feedback!<br>
-  #         <a href="https://github.com/ultralytics/ultralytics/issues/new?title=Docs+Feedback+for+{title}+page+at+https://docs.ultralytics.com/{url}&labels=enhancement&template=feature-request.yml" target="_blank" rel="noopener">Tell us what we can improve.</a>
+  alternate: # language drop-down
+    - name: English
+      link: /
+      lang: en
+    - name: 简体中文
+      link: /zh/
+      lang: zh

   social:
     - icon: fontawesome/brands/github
       link: https://github.com/ultralytics
@@ -145,6 +141,8 @@ nav:
       - Segment: tasks/segment.md
       - Classify: tasks/classify.md
       - Pose: tasks/pose.md
+  - Ultralytics 文档:
+      - zh/index.md
   - Quickstart: quickstart.md
   - Modes:
       - modes/index.md
setup.py (2 lines changed)

@@ -75,7 +75,7 @@ setup(
         'mkdocs-material',
         'mkdocstrings[python]',
         'mkdocs-redirects',  # for 301 redirects
-        'mkdocs-ultralytics-plugin>=0.0.29',  # for meta descriptions and images, dates and authors
+        'mkdocs-ultralytics-plugin>=0.0.30',  # for meta descriptions and images, dates and authors
     ],
     'export': [
         'coremltools>=7.0',
@@ -267,20 +267,28 @@ class BaseDataset(Dataset):
         return label

     def build_transforms(self, hyp=None):
-        """Users can custom augmentations here
-        like:
+        """
+        Users can customize augmentations here.
+
+        Example:
+            ```python
             if self.augment:
                 # Training transforms
                 return Compose([])
             else:
                 # Val transforms
                 return Compose([])
+            ```
         """
         raise NotImplementedError

     def get_labels(self):
-        """Users can custom their own format here.
-        Make sure your output is a list with each element like below:
+        """
+        Users can customize their own format here.
+
+        Note:
+            Ensure output is a dictionary with the following keys:
+            ```python
             dict(
                 im_file=im_file,
                 shape=shape,  # format: (height, width)
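The revised `build_transforms` docstring returns `Compose([])` objects. For readers outside the codebase, a minimal stand-in for such a `Compose` (an illustrative assumption, not the actual `ultralytics` implementation) is just a callable that chains transforms in order:

```python
class Compose:
    """Minimal illustration of a transform pipeline: applies each transform in order."""

    def __init__(self, transforms):
        self.transforms = transforms  # list of callables, e.g. augmentations

    def __call__(self, data):
        for t in self.transforms:
            data = t(data)
        return data


# An empty Compose, as in the docstring example, acts as the identity transform:
identity = Compose([])
pipeline = Compose([lambda x: x + 1, lambda x: x * 2])
```

This is why returning `Compose([])` for validation is a reasonable "no augmentation" default.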
@@ -291,5 +299,6 @@ class BaseDataset(Dataset):
                 normalized=True,  # or False
                 bbox_format="xyxy",  # or xywh, ltwh
             )
+            ```
         """
         raise NotImplementedError
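The revised `get_labels` docstring pins down the per-image label dictionary. A small sketch of building and checking one such entry (field values are made up for illustration, and fields not visible in the diff excerpt are left elided):

```python
# Hypothetical label entry following the documented format.
label = dict(
    im_file='images/train/0001.jpg',  # made-up path for illustration
    shape=(480, 640),                 # format: (height, width)
    # ... other fields elided in the diff above ...
    normalized=True,                  # or False
    bbox_format='xyxy',               # or xywh, ltwh
)

# A custom loader can validate the documented keys before use:
required = {'im_file', 'shape', 'normalized', 'bbox_format'}
missing = required - label.keys()
```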
@@ -154,28 +154,40 @@ def verify_image_label(args):


 def polygon2mask(imgsz, polygons, color=1, downsample_ratio=1):
     """
+    Convert a list of polygons to a binary mask of the specified image size.
+
     Args:
-        imgsz (tuple): The image size.
-        polygons (list[np.ndarray]): [N, M], N is the number of polygons, M is the number of points(Be divided by 2).
-        color (int): color
-        downsample_ratio (int): downsample ratio
+        imgsz (tuple): The size of the image as (height, width).
+        polygons (list[np.ndarray]): A list of polygons. Each polygon is an array with shape [N, M], where
+            N is the number of polygons, and M is the number of points such that M % 2 = 0.
+        color (int, optional): The color value to fill in the polygons on the mask. Defaults to 1.
+        downsample_ratio (int, optional): Factor by which to downsample the mask. Defaults to 1.
+
+    Returns:
+        (np.ndarray): A binary mask of the specified image size with the polygons filled in.
     """
     mask = np.zeros(imgsz, dtype=np.uint8)
     polygons = np.asarray(polygons, dtype=np.int32)
     polygons = polygons.reshape((polygons.shape[0], -1, 2))
     cv2.fillPoly(mask, polygons, color=color)
     nh, nw = (imgsz[0] // downsample_ratio, imgsz[1] // downsample_ratio)
-    # NOTE: fillPoly first then resize is trying to keep the same way of loss calculation when mask-ratio=1.
+    # Note: fillPoly first then resize is trying to keep the same loss calculation method when mask-ratio=1
     return cv2.resize(mask, (nw, nh))


 def polygons2masks(imgsz, polygons, color, downsample_ratio=1):
     """
+    Convert a list of polygons to a set of binary masks of the specified image size.
+
     Args:
-        imgsz (tuple): The image size.
-        polygons (list[np.ndarray]): each polygon is [N, M], N is number of polygons, M is number of points (M % 2 = 0)
-        color (int): color
-        downsample_ratio (int): downsample ratio
+        imgsz (tuple): The size of the image as (height, width).
+        polygons (list[np.ndarray]): A list of polygons. Each polygon is an array with shape [N, M], where
+            N is the number of polygons, and M is the number of points such that M % 2 = 0.
+        color (int): The color value to fill in the polygons on the masks.
+        downsample_ratio (int, optional): Factor by which to downsample each mask. Defaults to 1.
+
+    Returns:
+        (np.ndarray): A set of binary masks of the specified image size with the polygons filled in.
     """
     return np.array([polygon2mask(imgsz, [x.reshape(-1)], color, downsample_ratio) for x in polygons])
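The updated docstrings pin down the polygon convention: each polygon is a flat coordinate array of even length M, reshaped into (x, y) pairs before `cv2.fillPoly`. A dependency-free sketch of that reshape step (mirroring `reshape((n, -1, 2))` without cv2 or numpy, purely for illustration):

```python
def flat_polygon_to_points(flat):
    """Pair up a flat [x0, y0, x1, y1, ...] coordinate list, as the mask code does via reshape."""
    if len(flat) % 2 != 0:  # the docstring's M % 2 = 0 requirement
        raise ValueError('polygon must have an even number of coordinates')
    return [(flat[i], flat[i + 1]) for i in range(0, len(flat), 2)]


# A triangle given as 6 flat coordinates becomes 3 (x, y) points:
triangle = flat_polygon_to_points([0, 0, 10, 0, 5, 8])
```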
@@ -205,7 +217,7 @@ def find_dataset_yaml(path: Path) -> Path:
     Find and return the YAML file associated with a Detect, Segment or Pose dataset.

     This function searches for a YAML file at the root level of the provided directory first, and if not found, it
-    performs a recursive search. It prefers YAML files that have the samestem as the provided path. An AssertionError
+    performs a recursive search. It prefers YAML files that have the same stem as the provided path. An AssertionError
     is raised if no YAML file is found or if multiple YAML files are found.

     Args:
|
|||||||
self.stats = {'nc': len(data['names']), 'names': list(data['names'].values())} # statistics dictionary
|
self.stats = {'nc': len(data['names']), 'names': list(data['names'].values())} # statistics dictionary
|
||||||
self.data = data
|
self.data = data
|
||||||
|
|
||||||
def _unzip(self, path):
|
@staticmethod
|
||||||
|
def _unzip(path):
|
||||||
"""Unzip data.zip."""
|
"""Unzip data.zip."""
|
||||||
if not str(path).endswith('.zip'): # path is data.yaml
|
if not str(path).endswith('.zip'): # path is data.yaml
|
||||||
return False, None, path
|
return False, None, path
|
||||||
|
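The change above is safe because `_unzip` reads no instance state, so it can be promoted to a `@staticmethod`. A minimal self-contained sketch of the same gate-then-extract shape (the class name and extraction layout here are illustrative, not the real `HUBDatasetStats`):

```python
from pathlib import Path
from zipfile import ZipFile

class DatasetStatsSketch:
    """Stand-in showing why _unzip can be a staticmethod: it touches no self attributes."""

    @staticmethod
    def _unzip(path):
        """Unzip data.zip; pass plain data.yaml paths through untouched."""
        if not str(path).endswith('.zip'):  # path is data.yaml
            return False, None, path
        unzip_dir = Path(path).with_suffix('')  # extract next to the archive
        with ZipFile(path) as zf:
            zf.extractall(unzip_dir)
        return True, unzip_dir.stem, unzip_dir
```

Being static, the helper is callable on the class itself, no instance required.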
@@ -171,7 +171,7 @@ class MLP(nn.Module):
             hidden_dim (int): The dimensionality of the hidden layers.
             output_dim (int): The dimensionality of the output layer.
             num_layers (int): The number of hidden layers.
-            sigmoid_output (bool, optional): Whether to apply a sigmoid activation to the output layer. Defaults to False.
+            sigmoid_output (bool, optional): Apply a sigmoid activation to the output layer. Defaults to False.
         """
         super().__init__()
         self.num_layers = num_layers
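The forward pass this `MLP` docstring describes (ReLU between hidden layers, optional sigmoid on the output) can be sketched in NumPy; `mlp_forward` is an illustrative stand-in, not the module's actual `forward`:

```python
import numpy as np

def mlp_forward(x, weights, biases, sigmoid_output=False):
    """MLP forward pass: ReLU between layers, optional sigmoid on the final output."""
    last = len(weights) - 1
    for i, (w, b) in enumerate(zip(weights, biases)):
        x = x @ w + b
        if i < last:
            x = np.maximum(x, 0.0)  # ReLU on hidden layers only
    return 1.0 / (1.0 + np.exp(-x)) if sigmoid_output else x
```

With `sigmoid_output=True` every output lands strictly in (0, 1); otherwise the last layer is left linear.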
@@ -10,13 +10,15 @@ import torch.nn.functional as F
 from ultralytics.nn.modules import LayerNorm2d, MLPBlock
 
 
-# This class and its supporting functions below lightly adapted from the ViTDet backbone available at: https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/backbone/vit.py # noqa
 class ImageEncoderViT(nn.Module):
     """
     An image encoder using Vision Transformer (ViT) architecture for encoding an image into a compact latent space. The
     encoder takes an image, splits it into patches, and processes these patches through a series of transformer blocks.
     The encoded patches are then processed through a neck to generate the final encoded representation.
 
+    This class and its supporting functions below lightly adapted from the ViTDet backbone available at
+    https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/backbone/vit.py.
+
     Attributes:
         img_size (int): Dimension of input images, assumed to be square.
         patch_embed (PatchEmbed): Module for patch embedding.
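The image -> patches -> transformer blocks -> neck pipeline in this docstring is easiest to follow as a shape walk-through. The concrete numbers below (1024px input, 16px patches, 768-dim tokens, 256-channel neck) are SAM-style defaults used for illustration, not values stated in this diff:

```python
import numpy as np

# Shape walk-through of the ViT encoder pipeline (illustrative numbers).
img_size, patch_size, embed_dim, out_chans = 1024, 16, 768, 256

image = np.zeros((1, 3, img_size, img_size))   # input image batch
n = img_size // patch_size                     # 64 patches per side
tokens = np.zeros((1, n, n, embed_dim))        # token grid after patch embedding
# ... transformer blocks preserve the (1, 64, 64, 768) token grid ...
neck_out = np.zeros((1, out_chans, n, n))      # neck convs produce (1, 256, 64, 64)
```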
@@ -410,6 +412,8 @@ class Attention(nn.Module):
         input_size: Optional[Tuple[int, int]] = None,
     ) -> None:
         """
+        Initialize Attention module.
+
         Args:
             dim (int): Number of input channels.
             num_heads (int): Number of attention heads.
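For context on what this `Attention` module computes per head, a minimal single-head scaled dot-product sketch (illustrative only; the real module adds qkv projections and optional relative position biases):

```python
import numpy as np

def single_head_attention(q, k, v):
    """softmax(q k^T / sqrt(d)) v for one head; q, k, v are (tokens, head_dim)."""
    scores = (q @ k.T) / np.sqrt(q.shape[-1])
    scores = np.exp(scores - scores.max(axis=-1, keepdims=True))  # numerically stable softmax
    weights = scores / scores.sum(axis=-1, keepdims=True)
    return weights @ v
```

With all-zero queries and keys the weights are uniform, so each output row is simply the mean of the value rows.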
@@ -502,8 +506,8 @@ def window_unpartition(windows: torch.Tensor, window_size: int, pad_hw: Tuple[in
 
 def get_rel_pos(q_size: int, k_size: int, rel_pos: torch.Tensor) -> torch.Tensor:
     """
-    Get relative positional embeddings according to the relative positions of
-    query and key sizes.
+    Get relative positional embeddings according to the relative positions of query and key sizes.
 
     Args:
         q_size (int): size of query q.
         k_size (int): size of key k.
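The core of `get_rel_pos` is indexing a `(2*max(q_size, k_size) - 1, C)` embedding table by scaled relative coordinates. A sketch of that indexing step (it skips the interpolation/resizing the real function performs when the table size does not match):

```python
import numpy as np

def get_rel_pos_sketch(q_size, k_size, rel_pos):
    """Index a (2*max(q,k)-1, C) table by scaled relative coordinates (no resizing)."""
    q_coords = np.arange(q_size)[:, None] * max(k_size / q_size, 1.0)
    k_coords = np.arange(k_size)[None, :] * max(q_size / k_size, 1.0)
    # shift so the most negative relative coordinate maps to table index 0
    rel = (q_coords - k_coords) + (k_size - 1) * max(q_size / k_size, 1.0)
    return rel_pos[rel.astype(np.int64)]  # (q_size, k_size, C)
```

For equal query and key sizes the lookup index reduces to `i - j + (k_size - 1)`.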
@@ -542,8 +546,9 @@ def add_decomposed_rel_pos(
     k_size: Tuple[int, int],
 ) -> torch.Tensor:
     """
-    Calculate decomposed Relative Positional Embeddings from :paper:`mvitv2`.
-    https://github.com/facebookresearch/mvit/blob/19786631e330df9f3622e5402b4a419a263a2c80/mvit/models/attention.py # noqa B950
+    Calculate decomposed Relative Positional Embeddings from mvitv2 paper at
+    https://github.com/facebookresearch/mvit/blob/main/mvit/models/attention.py.
 
     Args:
         attn (Tensor): attention map.
         q (Tensor): query q in the attention layer with shape (B, q_h * q_w, C).
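"Decomposed" here means the 2D relative-position bias is split into independent height and width terms added to the attention map. A NumPy sketch of that addition, assuming the per-axis embedding tables have already been gathered (a simplified stand-in for the torch implementation):

```python
import numpy as np

def add_decomposed_rel_pos_sketch(attn, q, rel_h, rel_w, q_hw, k_hw):
    """attn: (B, q_h*q_w, k_h*k_w); q: (B, q_h*q_w, C);
    rel_h: (q_h, k_h, C) and rel_w: (q_w, k_w, C) per-axis embeddings."""
    q_h, q_w = q_hw
    k_h, k_w = k_hw
    B, _, C = q.shape
    r_q = q.reshape(B, q_h, q_w, C)
    bias_h = np.einsum('bhwc,hkc->bhwk', r_q, rel_h)  # (B, q_h, q_w, k_h)
    bias_w = np.einsum('bhwc,wkc->bhwk', r_q, rel_w)  # (B, q_h, q_w, k_w)
    attn = attn.reshape(B, q_h, q_w, k_h, k_w)
    attn = attn + bias_h[:, :, :, :, None] + bias_w[:, :, :, None, :]
    return attn.reshape(B, q_h * q_w, k_h * k_w)
```

With zero embedding tables the attention map passes through unchanged, which is a quick sanity check on the broadcasting.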
@@ -583,6 +588,8 @@ class PatchEmbed(nn.Module):
         embed_dim: int = 768,
     ) -> None:
         """
+        Initialize PatchEmbed module.
+
         Args:
             kernel_size (Tuple): kernel size of the projection layer.
             stride (Tuple): stride of the projection layer.
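`PatchEmbed`'s projection layer is a convolution whose kernel size equals its stride, so each output token sees one non-overlapping patch. A reshape-only sketch of that patch extraction (the learned linear projection itself is omitted):

```python
import numpy as np

def patchify(x, patch=16):
    """(B, C, H, W) -> (B, H//p, W//p, C*p*p): the flattened patches a stride-p conv projects."""
    B, C, H, W = x.shape
    x = x.reshape(B, C, H // patch, patch, W // patch, patch)
    x = x.transpose(0, 2, 4, 1, 3, 5)  # (B, H//p, W//p, C, p, p)
    return x.reshape(B, H // patch, W // patch, C * patch * patch)
```

Each flattened patch matches the corresponding image crop exactly, so no pixels are dropped or duplicated.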
@@ -39,7 +39,8 @@ class TransformerEncoderLayer(nn.Module):
         self.act = act
         self.normalize_before = normalize_before
 
-    def with_pos_embed(self, tensor, pos=None):
+    @staticmethod
+    def with_pos_embed(tensor, pos=None):
         """Add position embeddings to the tensor if provided."""
         return tensor if pos is None else tensor + pos
 
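Like the `_unzip` change earlier in this commit, this refactor works because the helper never touches `self`; its whole behavior fits in one line, sketched here as a free function:

```python
import numpy as np

def with_pos_embed(tensor, pos=None):
    """Return the tensor unchanged when pos is None, otherwise element-wise add the embedding."""
    return tensor if pos is None else tensor + pos
```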
@@ -180,9 +181,10 @@ class LayerNorm2d(nn.Module):
     """
     2D Layer Normalization module inspired by Detectron2 and ConvNeXt implementations.
 
-    Original implementation at
+    Original implementations in
     https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/batch_norm.py
-    https://github.com/facebookresearch/ConvNeXt/blob/d1fa8f6fef0a165b27399986cc2bdacc92777e40/models/convnext.py#L119
+    and
+    https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext.py.
     """
 
     def __init__(self, num_channels, eps=1e-6):
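The distinguishing detail of `LayerNorm2d` versus ordinary LayerNorm is that statistics are computed over the channel axis of a `(B, C, H, W)` tensor, independently at every spatial location. A functional NumPy sketch (the real module stores `weight`/`bias` as learnable parameters):

```python
import numpy as np

def layer_norm_2d(x, weight, bias, eps=1e-6):
    """Normalize (B, C, H, W) activations over the channel dim only, then scale and shift."""
    u = x.mean(axis=1, keepdims=True)                 # per-pixel channel mean
    s = ((x - u) ** 2).mean(axis=1, keepdims=True)    # per-pixel channel variance
    x = (x - u) / np.sqrt(s + eps)
    return weight[None, :, None, None] * x + bias[None, :, None, None]
```

After normalization, each pixel's channel vector has (approximately) zero mean and unit variance.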
@@ -250,7 +252,7 @@ class MSDeformAttn(nn.Module):
 
     def forward(self, query, refer_bbox, value, value_shapes, value_mask=None):
         """
-        Perform forward pass for multi-scale deformable attention.
+        Perform forward pass for multiscale deformable attention.
 
         https://github.com/PaddlePaddle/PaddleDetection/blob/develop/ppdet/modeling/transformers/deformable_transformer.py
 
@@ -48,8 +48,7 @@ def bbox_ioa(box1, box2, iou=False, eps=1e-7):
 
 def box_iou(box1, box2, eps=1e-7):
     """
-    Calculate intersection-over-union (IoU) of boxes.
-    Both sets of boxes are expected to be in (x1, y1, x2, y2) format.
+    Calculate intersection-over-union (IoU) of boxes. Both sets of boxes are expected to be in (x1, y1, x2, y2) format.
 
     Based on https://github.com/pytorch/vision/blob/master/torchvision/ops/boxes.py
 
     Args:
@@ -61,8 +61,8 @@ class TaskAlignedAssigner(nn.Module):
     """
     A task-aligned assigner for object detection.
 
-    This class assigns ground-truth (gt) objects to anchors based on the task-aligned metric,
-    which combines both classification and localization information.
+    This class assigns ground-truth (gt) objects to anchors based on the task-aligned metric, which combines both
+    classification and localization information.
 
     Attributes:
         topk (int): The number of top candidates to consider.
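The "task-aligned metric" combining classification and localization is a product of the class score and the predicted-vs-gt IoU, each raised to a tunable exponent. A one-line sketch (the exponent values below are illustrative, not read from this diff):

```python
import numpy as np

def task_aligned_metric(cls_scores, ious, alpha=1.0, beta=6.0):
    """align = score**alpha * IoU**beta: large only when classification AND localization agree."""
    return cls_scores ** alpha * ious ** beta
```

With a large `beta`, an anchor with a confident score but poor IoU is ranked far below an equally confident, well-localized one.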
@@ -85,8 +85,8 @@ class TaskAlignedAssigner(nn.Module):
     @torch.no_grad()
     def forward(self, pd_scores, pd_bboxes, anc_points, gt_labels, gt_bboxes, mask_gt):
         """
-        Compute the task-aligned assignment.
-        Reference https://github.com/Nioolek/PPYOLOE_pytorch/blob/master/ppyoloe/assigner/tal_assigner.py
+        Compute the task-aligned assignment. Reference code is available at
+        https://github.com/Nioolek/PPYOLOE_pytorch/blob/master/ppyoloe/assigner/tal_assigner.py.
 
         Args:
             pd_scores (Tensor): shape(bs, num_total_anchors, num_classes)
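Inside this assignment, each ground-truth object keeps only its `topk` highest-metric anchors as positive candidates. That selection step can be sketched as a boolean mask in NumPy (an illustrative stand-in for the batched torch version):

```python
import numpy as np

def select_topk_mask(metrics, topk=10):
    """metrics: (num_gt, num_anchors) -> boolean mask of each gt's top-k anchors."""
    idx = np.argsort(-metrics, axis=-1)[:, :topk]       # indices of the k largest per row
    mask = np.zeros(metrics.shape, dtype=bool)
    np.put_along_axis(mask, idx, True, axis=-1)
    return mask
```

Each row of the mask marks exactly `topk` anchors, regardless of how the metric mass is distributed.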