Mirror of https://github.com/THU-MIG/yolov10.git (synced 2025-05-23 13:34:23 +08:00)
ultralytics 8.0.185 dependencies and tests fixes (#5046)

Co-authored-by: Zlobin Vladimir <vladimir.zlobin@intel.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>

This commit is contained in:
parent 92753ebd84
commit a9033118cf
.github/workflows/ci.yaml (vendored, 6 changed lines)
@@ -184,9 +184,9 @@ jobs:
       run: |  # CoreML must be installed before export due to protobuf error from AutoInstall
         python -m pip install --upgrade pip wheel
         if [ "${{ matrix.torch }}" == "1.8.0" ]; then
-          pip install -e . torch==1.8.0 torchvision==0.9.0 pytest-cov "coremltools>=7.0.b1" --extra-index-url https://download.pytorch.org/whl/cpu
+          pip install -e . torch==1.8.0 torchvision==0.9.0 pytest-cov "coremltools>=7.0" --extra-index-url https://download.pytorch.org/whl/cpu
         else
-          pip install -e . pytest-cov "coremltools>=7.0.b1" --extra-index-url https://download.pytorch.org/whl/cpu
+          pip install -e . pytest-cov "coremltools>=7.0" --extra-index-url https://download.pytorch.org/whl/cpu
         fi
     - name: Check environment
       run: |
@@ -267,7 +267,7 @@ jobs:
         conda install -c pytorch -c conda-forge pytorch torchvision ultralytics
     - name: Install pip packages
       run: |
-        pip install pytest 'coremltools>=7.0.b1'  # 'openvino-dev>=2023.0'
+        pip install pytest 'coremltools>=7.0'  # 'openvino-dev>=2023.0'
     - name: Check environment
       run: |
         echo "RUNNER_OS is ${{ runner.os }}"
docs/guides/docker-quickstart.md (new file, 119 lines)
@@ -0,0 +1,119 @@
---
comments: true
description: Complete guide to setting up and using Ultralytics YOLO models with Docker. Learn how to install Docker, manage GPU support, and run YOLO models in isolated containers.
keywords: Ultralytics, YOLO, Docker, GPU, containerization, object detection, package installation, deep learning, machine learning, guide
---

# Docker Quickstart Guide for Ultralytics

This guide serves as a comprehensive introduction to setting up a Docker environment for your Ultralytics projects. Docker is a platform for developing, shipping, and running applications in containers. It is particularly beneficial for ensuring that the software will always run the same, regardless of where it's deployed. For more details, visit the Ultralytics Docker repository on [Docker Hub](https://hub.docker.com/r/ultralytics/ultralytics).

[Docker Pulls](https://hub.docker.com/r/ultralytics/ultralytics)

<p align="center">
  <img width="800" src="https://user-images.githubusercontent.com/26833433/270173601-fc7011bd-e67c-452f-a31a-aa047dcd2771.png" alt="Ultralytics Docker Package Visual">
</p>

## What You Will Learn

- Setting up Docker with NVIDIA support
- Installing Ultralytics Docker images
- Running Ultralytics in a Docker container
- Mounting local directories into the container

---

## Prerequisites

- Make sure Docker is installed on your system. If not, you can download and install it from [Docker's website](https://www.docker.com/products/docker-desktop).
- Ensure that your system has an NVIDIA GPU and that NVIDIA drivers are installed.

---

## Setting up Docker with NVIDIA Support

First, verify that the NVIDIA drivers are properly installed by running:

```bash
nvidia-smi
```

### Installing NVIDIA Docker Runtime

Now, let's install the NVIDIA Docker runtime to enable GPU support in Docker containers:

```bash
# Add NVIDIA package repositories
curl -s -L https://nvidia.github.io/nvidia-docker/gpgkey | sudo apt-key add -
distribution=$(lsb_release -cs)
curl -s -L https://nvidia.github.io/nvidia-docker/$distribution/nvidia-docker.list | sudo tee /etc/apt/sources.list.d/nvidia-docker.list

# Install NVIDIA Docker runtime
sudo apt-get update
sudo apt-get install -y nvidia-docker2

# Restart Docker service to apply changes
sudo systemctl restart docker
```

### Verify NVIDIA Runtime with Docker

Run `docker info | grep -i runtime` to ensure that `nvidia` appears in the list of runtimes:

```bash
docker info | grep -i runtime
```

---

## Installing Ultralytics Docker Images

Ultralytics offers several Docker images optimized for various platforms and use cases:

- **Dockerfile:** GPU image, ideal for training.
- **Dockerfile-arm64:** For ARM64 architecture, suitable for devices like Raspberry Pi.
- **Dockerfile-cpu:** CPU-only version for inference and non-GPU environments.
- **Dockerfile-jetson:** Optimized for NVIDIA Jetson devices.
- **Dockerfile-python:** Minimal Python environment for lightweight applications.
- **Dockerfile-conda:** Includes Miniconda3 and the Ultralytics package installed via Conda.

To pull the latest image:

```bash
# Set image name as a variable
t=ultralytics/ultralytics:latest

# Pull the latest Ultralytics image from Docker Hub
sudo docker pull $t
```

---

## Running Ultralytics in a Docker Container

Here's how to execute the Ultralytics Docker container:

```bash
# Run with all GPUs
sudo docker run -it --ipc=host --gpus all $t

# Run specifying which GPUs to use
sudo docker run -it --ipc=host --gpus '"device=2,3"' $t
```

The `-it` flag assigns a pseudo-TTY and keeps stdin open, allowing you to interact with the container. The `--ipc=host` flag shares the host's IPC namespace, which is essential for sharing memory between processes. The `--gpus` flag gives the container access to the host's GPUs.
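Once inside the container, a quick way to confirm that everything works is to run a prediction with the Python API — a minimal sketch, assuming the default `yolov8n.pt` weights (downloaded automatically on first use) and a sample image URL:

```python
from ultralytics import YOLO

# Load a small pretrained detection model
model = YOLO('yolov8n.pt')

# Run inference on a sample image and print the detected boxes
results = model('https://ultralytics.com/images/bus.jpg')
print(results[0].boxes)
```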
### Note on File Accessibility

To work with files on your local machine within the container, you can use Docker volumes:

```bash
# Mount a local directory into the container
sudo docker run -it --ipc=host --gpus all -v /path/on/host:/path/in/container $t
```

Replace `/path/on/host` with the directory path on your local machine and `/path/in/container` with the desired path inside the Docker container.

---

Congratulations! You're now set up to use Ultralytics with Docker and ready to take advantage of its powerful capabilities. For alternate installation methods, feel free to explore the [Ultralytics quickstart documentation](https://docs.ultralytics.com/quickstart/).
@@ -19,6 +19,7 @@ Here's a compilation of in-depth guides to help you master different aspects of
 * [Using YOLOv8 with SAHI for Sliced Inference](sahi-tiled-inference.md) 🚀 NEW: Comprehensive guide on leveraging SAHI's sliced inference capabilities with YOLOv8 for object detection in high-resolution images.
 * [AzureML Quickstart](azureml-quickstart.md) 🚀 NEW: Get up and running with Ultralytics YOLO models on Microsoft's Azure Machine Learning platform. Learn how to train, deploy, and scale your object detection projects in the cloud.
 * [Conda Quickstart](conda-quickstart.md) 🚀 NEW: Step-by-step guide to setting up a Conda environment for Ultralytics. Learn how to install and start using the Ultralytics package efficiently with Conda.
+* [Docker Quickstart](docker-quickstart.md) 🚀 NEW: Complete guide to setting up and using Ultralytics YOLO models with Docker. Learn how to install Docker, manage GPU support, and run YOLO models in isolated containers for consistent development and deployment.
 
 ## Contribute to Our Guides
 
@@ -114,6 +114,8 @@ Ultralytics provides various installation methods including pip, conda, and Docker
 
 Alter `/path/on/host` with the directory path on your local machine, and `/path/in/container` with the desired path inside the Docker container for accessibility.
 
+For advanced Docker usage, feel free to explore the [Ultralytics Docker Guide](https://docs.ultralytics.com/guides/docker-quickstart/).
+
 See the `ultralytics` [requirements.txt](https://github.com/ultralytics/ultralytics/blob/main/requirements.txt) file for a list of dependencies. Note that all examples above install all required dependencies.
 
 !!! tip "Tip"
@@ -219,6 +219,7 @@ nav:
     - SAHI Tiled Inference: guides/sahi-tiled-inference.md
     - AzureML Quickstart: guides/azureml-quickstart.md
     - Conda Quickstart: guides/conda-quickstart.md
+    - Docker Quickstart: guides/docker-quickstart.md
 - Integrations:
     - integrations/index.md
     - OpenVINO: integrations/openvino.md
@@ -24,7 +24,7 @@ pandas>=1.1.4
 seaborn>=0.11.0
 
 # Export --------------------------------------
-# coremltools>=7.0.b1  # CoreML export
+# coremltools>=7.0  # CoreML export
 # onnx>=1.12.0  # ONNX export
 # onnxsim>=0.4.1  # ONNX simplifier
 # nvidia-pyindex  # TensorRT export
setup.py (2 changed lines)

@@ -71,7 +71,7 @@ setup(
         'mkdocs-ultralytics-plugin>=0.0.27',  # for meta descriptions and images, dates and authors
     ],
     'export': [
-        'coremltools>=7.0.b1',
+        'coremltools>=7.0',
         'openvino-dev>=2023.0',
         'tensorflowjs',  # automatically installs tensorflow
     ], },
@@ -214,6 +214,7 @@ def test_export_paddle(enabled=False):
     YOLO(MODEL).export(format='paddle')
 
 
+@pytest.mark.slow
 def test_export_ncnn():
     f = YOLO(MODEL).export(format='ncnn')
     YOLO(f)(SOURCE)  # exported model inference
@@ -1,6 +1,6 @@
 # Ultralytics YOLO 🚀, AGPL-3.0 license
 
-__version__ = '8.0.184'
+__version__ = '8.0.185'
 
 from ultralytics.models import RTDETR, SAM, YOLO
 from ultralytics.models.fastsam import FastSAM
@@ -571,8 +571,6 @@ class LetterBox:
         if self.center:
             dw /= 2  # divide padding into 2 sides
             dh /= 2
-        if labels.get('ratio_pad'):
-            labels['ratio_pad'] = (labels['ratio_pad'], (dw, dh))  # for evaluation
 
         if shape[::-1] != new_unpad:  # resize
             img = cv2.resize(img, new_unpad, interpolation=cv2.INTER_LINEAR)
@@ -580,6 +578,8 @@ class LetterBox:
         left, right = int(round(dw - 0.1)) if self.center else 0, int(round(dw + 0.1))
         img = cv2.copyMakeBorder(img, top, bottom, left, right, cv2.BORDER_CONSTANT,
                                  value=(114, 114, 114))  # add border
+        if labels.get('ratio_pad'):
+            labels['ratio_pad'] = (labels['ratio_pad'], (left, top))  # for evaluation
 
         if len(labels):
             labels = self._update_labels(labels, ratio, dw, dh)
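The effect of moving these two lines is that `ratio_pad` now records the integer border actually applied (`left`, `top`) instead of the fractional half-padding (`dw`, `dh`). A small sketch with hypothetical numbers, assuming the top/bottom rounding mirrors the left/right line shown above:

```python
# Letterboxing a 1280x720 image into 640x640 leaves 280 px of vertical padding.
dw, dh = 0.0, 280.0      # total horizontal / vertical padding
dw, dh = dw / 2, dh / 2  # divide padding into 2 sides -> (0.0, 140.0)

# Integer border that cv2.copyMakeBorder actually adds in the centered case
left = int(round(dw - 0.1))  # 0
top = int(round(dh - 0.1))   # 140

# Old code stored (dw, dh) == (0.0, 140.0); new code stores (left, top) == (0, 140),
# i.e. the pixel offsets applied to the image, computed after rounding.
print((dw, dh), (left, top))
```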
@@ -507,7 +507,7 @@ class Exporter:
     def export_coreml(self, prefix=colorstr('CoreML:')):
         """YOLOv8 CoreML export."""
         mlmodel = self.args.format.lower() == 'mlmodel'  # legacy *.mlmodel export format requested
-        check_requirements('coremltools>=6.0,<=6.2' if mlmodel else 'coremltools>=7.0.b1')
+        check_requirements('coremltools>=6.0,<=6.2' if mlmodel else 'coremltools>=7.0')
         import coremltools as ct  # noqa
 
         LOGGER.info(f'\n{prefix} starting export with coremltools {ct.__version__}...')
@@ -648,7 +648,7 @@ class Exporter:
         check_requirements(f"tensorflow{'-macos' if MACOS else '-aarch64' if ARM64 else '' if cuda else '-cpu'}")
         import tensorflow as tf  # noqa
         check_requirements(
-            ('onnx', 'onnx2tf>=1.15.4', 'sng4onnx>=1.0.1', 'onnxsim>=0.4.33', 'onnx_graphsurgeon>=0.3.26',
+            ('onnx', 'onnx2tf>=1.15.4,<=1.17.5', 'sng4onnx>=1.0.1', 'onnxsim>=0.4.33', 'onnx_graphsurgeon>=0.3.26',
              'tflite_support', 'onnxruntime-gpu' if cuda else 'onnxruntime'),
             cmds='--extra-index-url https://pypi.ngc.nvidia.com')  # onnx_graphsurgeon only on NVIDIA
@@ -61,9 +61,7 @@ def parse_version(version='0.0.0') -> tuple:
         (tuple): Tuple of integers representing the numeric part of the version and the extra string, i.e. (2, 0, 1)
     """
     try:
-        correct = [True if x == '.' else x.isdigit() for x in version]  # first non-number index
-        v = version[:correct.index(False)] if False in correct else version
-        return tuple(map(int, v.split('.')))  # '2.0.1+cpu' -> (2, 0, 1)
+        return tuple(map(int, re.findall(r'\d+', version)[:3]))  # '2.0.1+cpu' -> (2, 0, 1)
     except Exception as e:
         LOGGER.warning(f'WARNING ⚠️ failure for parse_version({version}), reverting to deprecated pkg_resources: {e}')
         import pkg_resources
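The rewritten `parse_version` simply takes the first three integer groups from the string, which also tolerates suffixes such as `+cpu`. A standalone sketch of the new one-liner (illustrative only, not the full function with its pkg_resources fallback):

```python
import re

def parse_version_sketch(version='0.0.0'):
    """Return the first three numeric components of a version string as a tuple of ints."""
    return tuple(map(int, re.findall(r'\d+', version)[:3]))

print(parse_version_sketch('2.0.1+cpu'))  # (2, 0, 1)
print(parse_version_sketch('8.0.185'))    # (8, 0, 185)
print(parse_version_sketch('7.0b1'))      # (7, 0, 1) -- trailing pre-release digits are captured too
```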
@@ -166,9 +164,12 @@ def check_version(current: str = '0.0.0',
         # check if current version is between 20.04 (inclusive) and 22.04 (exclusive)
         check_version(current='21.10', required='>20.04,<22.04')
     """
-    if not (current and required):  # if any inputs missing
+    if not current:  # if current is '' or None
         LOGGER.warning(f'WARNING ⚠️ invalid check_version({current}, {required}) requested, please check values.')
-        return True  # in case required is '' or None
+        return True
+
+    if not required:  # if required is '' or None
+        return True
 
     current = parse_version(current)  # '1.2.3' -> (1, 2, 3)
     constraints = re.findall(r'([<>!=]{1,2}\s*\d+\.\d+)', required) or [f'>={required}']
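Splitting the guard changes behaviour only when `required` is empty: that case now passes silently instead of logging a warning, while a missing `current` still warns. A minimal sketch of just the new early-return logic (`warn` stands in for `LOGGER.warning`):

```python
def version_guard(current, required, warn=print):
    """Sketch of the new early returns at the top of check_version."""
    if not current:  # current is '' or None -> cannot verify, warn and pass
        warn(f'WARNING ⚠️ invalid check_version({current}, {required}) requested, please check values.')
        return True
    if not required:  # nothing to check against -> pass silently
        return True
    return None  # fall through to the real constraint parsing

print(version_guard('', '>=8.0'))         # warns, then True
print(version_guard('8.0.185', ''))       # True, no warning
print(version_guard('8.0.185', '>=8.0'))  # None -> constraints would be evaluated
```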
@@ -492,9 +493,9 @@ def collect_system_info():
                 f"{'CPU':<20}{get_cpu_info()}\n"
                 f"{'CUDA':<20}{torch.version.cuda if torch and torch.cuda.is_available() else None}\n")
 
-    if (ROOT.parent / 'requirements.txt').exists():  # pip install
+    if (ROOT.parent / 'requirements.txt').exists():  # git install
         requirements = parse_requirements()
-    else:  # git install
+    else:  # pip install
         from pkg_resources import get_distribution
         requirements = get_distribution('ultralytics').requires()
 
@@ -16,8 +16,8 @@ from ultralytics.utils import LOGGER, TQDM, checks, clean_url, emojis, is_online
 
 # Define Ultralytics GitHub assets maintained at https://github.com/ultralytics/assets
 GITHUB_ASSETS_REPO = 'ultralytics/assets'
-GITHUB_ASSETS_NAMES = [f'yolov8{k}{suffix}.pt' for k in 'nsmlx' for suffix in ('', '6', '-cls', '-seg', '-pose')] + \
-                      [f'yolov5{k}u.pt' for k in 'nsmlx'] + \
+GITHUB_ASSETS_NAMES = [f'yolov8{k}{suffix}.pt' for k in 'nsmlx' for suffix in ('', '-cls', '-seg', '-pose')] + \
+                      [f'yolov5{k}{resolution}u.pt' for k in 'nsmlx' for resolution in ('', '6')] + \
                       [f'yolov3{k}u.pt' for k in ('', '-spp', '-tiny')] + \
                       [f'yolo_nas_{k}.pt' for k in 'sml'] + \
                       [f'sam_{k}.pt' for k in 'bl'] + \
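The asset-name change moves the '6' (high-resolution) suffix from the YOLOv8 list to the YOLOv5u list, so for example `yolov5n6u.pt` becomes a known asset while `yolov8n6.pt` no longer is. A quick sketch of the two changed comprehensions and the names they now produce:

```python
# New YOLOv8 names: no '6' resolution suffix
yolov8 = [f'yolov8{k}{suffix}.pt' for k in 'nsmlx' for suffix in ('', '-cls', '-seg', '-pose')]

# New YOLOv5u names: base and '6' resolutions
yolov5u = [f'yolov5{k}{resolution}u.pt' for k in 'nsmlx' for resolution in ('', '6')]

print(yolov8[:4])   # ['yolov8n.pt', 'yolov8n-cls.pt', 'yolov8n-seg.pt', 'yolov8n-pose.pt']
print(yolov5u[:4])  # ['yolov5nu.pt', 'yolov5n6u.pt', 'yolov5su.pt', 'yolov5s6u.pt']
```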
@@ -176,10 +176,11 @@ def check_disk_space(url='https://ultralytics.com/assets/coco128.zip', sf=1.5, hard=True):
     Returns:
         (bool): True if there is sufficient disk space, False otherwise.
     """
-    r = requests.head(url)  # response
-
-    # Check response
-    assert r.status_code < 400, f'URL error for {url}: {r.status_code} {r.reason}'
+    try:
+        r = requests.head(url)  # response
+        assert r.status_code < 400, f'URL error for {url}: {r.status_code} {r.reason}'  # check response
+    except Exception:
+        return True  # requests issue, default to True
 
     # Check file size
     gib = 1 << 30  # bytes per GiB