
Commit 9de3298

S3 fix
1 parent d6f5cd3 commit 9de3298

6 files changed (+39, -31 lines)

apps/gradio/stable-diffusion/README.md

Lines changed: 3 additions & 1 deletion
@@ -165,7 +165,7 @@ ovhai app logs <app_id> --follow

When your app is ready for use, you will be able to generate your first images using the default checkpoint.

-For your information, the `--volume` parameter allows to use both Swift and S3 buckets. However, it's important to note that for S3 usage, a proper configuration is necessary. If S3 is not configured yet and you wish to use it, please read the [S3 compliance guide](/pages/public_cloud/ai_machine_learning/gi_08_s3_compliance).
+For your information, the `--volume` parameter allows you to use both Swift and S3* compatible Object Storage buckets. However, a proper configuration is necessary for S3 compatible usage. If S3 compatible storage is not configured yet and you wish to use it, please read the [S3 compatible compliance guide](/pages/public_cloud/ai_machine_learning/gi_08_s3_compliance).

### Step 3: Add Stable Diffusion checkpoints

@@ -268,3 +268,5 @@ If you need training or technical assistance to implement our solutions, contact
Please send us your questions, feedback and suggestions to improve the service:

- On the OVHcloud [Discord server](https://discord.com/invite/vXVurFfwe9)
+
+**\***: S3 is a trademark of Amazon Technologies, Inc. OVHcloud’s service is not sponsored by, endorsed by, or otherwise affiliated with Amazon Technologies, Inc.

apps/streamlit/whisper/README.md

Lines changed: 3 additions & 1 deletion
@@ -108,7 +108,7 @@ You can create your Object Storage bucket using either the UI (OVHcloud Control
>>
>> *`GRA` alias and `whisper-model` will be used in this tutorial.*
>>
->> For your information, the previous command is applicable to both Swift and S3 buckets. However, it's important to note that for S3 usage, a proper configuration is necessary. If S3 is not configured yet and you wish to use it, please read the [S3 compliance guide](/pages/public_cloud/ai_machine_learning/gi_08_s3_compliance).
+>> For your information, the previous command is applicable to both Swift and S3* compatible Object Storage buckets. However, a proper configuration is necessary for S3 compatible usage. If S3 compatible storage is not configured yet and you wish to use it, please read the [S3 compatible compliance guide](/pages/public_cloud/ai_machine_learning/gi_08_s3_compliance).

#### Download whisper in the created bucket
@@ -313,3 +313,5 @@ If you need training or technical assistance to implement our solutions, contact
Please send us your questions, feedback and suggestions to improve the service:

- On the OVHcloud [Discord server](https://discord.com/invite/vXVurFfwe9)
+
+**\***: S3 is a trademark of Amazon Technologies, Inc. OVHcloud’s service is not sponsored by, endorsed by, or otherwise affiliated with Amazon Technologies, Inc.
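As context for the bucket-creation note above: the tutorial drives everything through the `ovhai` CLI with the `GRA` alias, but the S3 compatible path can also be exercised directly from Python. A minimal `boto3` sketch, assuming an OVHcloud GRA endpoint and placeholder credentials (none of these values come from the tutorial itself):

```python
import boto3

# Assumed OVHcloud S3 compatible endpoint and region; replace all placeholders with your own values.
s3_client = boto3.client(
    "s3",
    endpoint_url="https://s3.gra.io.cloud.ovh.net/",
    region_name="gra",
    aws_access_key_id="<s3_access_key>",
    aws_secret_access_key="<s3_secret_key>",
)

# Create the tutorial's `whisper-model` bucket; bucket names must be unique within a region.
s3_client.create_bucket(
    Bucket="whisper-model",
    CreateBucketConfiguration={"LocationConstraint": "gra"},
)
```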

jobs/neuralangelo/Makefile

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ OUTPUT_DIR := logs/$(GROUP)/$(MODEL)
DOWNSAMPLE_RATE ?= 2
SCENE_TYPE ?= object

-# S3 config
+# S3* compatible config
DATASTORE ?= NEURALANGELO
BUCKET_NAME := neuralangelo-$(shell whoami)-$(MODEL)
BUCKET := $(BUCKET_NAME)@$(DATASTORE)

jobs/neuralangelo/README.md

Lines changed: 14 additions & 12 deletions
@@ -26,7 +26,7 @@ The processing will follow 3 main steps :
- 3D model extraction

Each step will be run using an AI-Training job and these jobs will share their data using an AI-Training volume synced
-with a S3 bucket.
+with an S3* compatible bucket.


### Makefile
@@ -92,27 +92,27 @@ Download the sample video:
gdown 1yWoZ4Hk3FgmV3pd34ZbW7jEqgqyJgzHy -O neuralangelo/input/
```

-### Configure an S3 bucket for ovhai
+### Configure an S3 compatible bucket for ovhai

-To be able to share data between the AI Training jobs we will run as well as providing code and data to our workloads, we need to configure an AI datastore pointing to an S3 endpoint.
+To share data between the AI Training jobs we will run, and to provide code and data to our workloads, we need to configure an AI datastore pointing to an S3 compatible endpoint.

```shell
ovhai datastore add s3 NEURALANGELO <s3_endpoint_url> <s3_region> <s3_access_key> --store-credentials-locally
```

>
-> Data store information (endpoint, region, access_key and secret key) can refer to an OVHcloud S3 bucket or any other provider.
+> Data store information (endpoint, region, access_key and secret key) can refer to an OVHcloud S3 compatible bucket or to any other provider.
>
> Using `--store-credentials-locally` is needed here to be able to push/pull data from a bucket using the ovhai CLI in the next steps.
>
-> See [this page](https://help.ovhcloud.com/csm/en-gb-public-cloud-ai-s3-compliance?id=kb_article_view&sysparm_article=KB0058011) for help about S3 usage.
+> See [this page](https://help.ovhcloud.com/csm/en-gb-public-cloud-ai-s3-compliance?id=kb_article_view&sysparm_article=KB0058011) for help about S3 compatible usage.
>

### Prepare model input using COLMAP

Data preparation relies on the process described in [Neuralangelo documentation](https://github.com/NVlabs/neuralangelo/blob/main/DATA_PROCESSING.md).

-#### Push the Neuralangelo project in the S3 bucket
+#### Push the Neuralangelo project to the S3 compatible bucket

```shell
make push-data
@@ -125,7 +125,7 @@ make push-data
> ovhai bucket object upload neuralangelo-experiments-lego@NEURALANGELO .
> ```
>
-> Note: As a bucket shall be unique in an S3 region, the Makefile uses the current username in the bucket name (`experiments` in this example).
+> Note: As a bucket name must be unique within an S3 compatible region, the Makefile uses the current username in the bucket name (`experiments` in this example).
>

#### Extract pictures from the video
@@ -174,7 +174,7 @@ make prepare-status
make prepare-logs
```

-Once the job is done, we get generated data from the S3 bucket:
+Once the job is done, we get the generated data from the S3 compatible bucket:

```shell
make pull-data
@@ -208,7 +208,7 @@ make adjust

Follow the process described [here](https://github.com/mli0603/BlenderNeuralangelo?tab=readme-ov-file#2-locating-the-control-panel) to adjust the bounding sphere.

-Push the adjusted configuration in the S3 bucket:
+Push the adjusted configuration to the S3 compatible bucket:

```shell
make push-data
@@ -313,7 +313,7 @@ make extract-status
make extract-logs
```

-Once the job is done, we get generated data from the S3 bucket:
+Once the job is done, we get the generated data from the S3 compatible bucket:

```shell
make pull-data
@@ -340,14 +340,14 @@ in `neuralangelo/projects/neuralangelo/configs/base.yaml`.
It is possible to change it:

- either using `torchrun` command line parameters,
-- or editing the file directly and sync it to the S3 bucket using `make data-push`.
+- or editing the file directly and syncing it to the S3 compatible bucket using `make data-push`.

### Checkpoints rendering

If the process is configured with a large number of iterations, the processing can be long. As Neuralangelo creates
intermediate checkpoints, we are able to try extraction on any intermediate model.

-To perform this, we need use `ovhai` to trigger a `data-push` on the running job to sync the S3 content and use
+To perform this, we need to use `ovhai` to trigger a `data-push` on the running job to sync the S3 compatible content and use
the previously described `make extract` command.

If you need training or technical assistance to implement our solutions, contact your sales representative or click on [this link](https://www.ovhcloud.com/en-gb/professional-services/) to get a quote and ask our Professional Services experts for a custom analysis of your project.
@@ -357,3 +357,5 @@ If you need training or technical assistance to implement our solutions, contact
Please send us your questions, feedback and suggestions to improve the service:

- On the OVHcloud [Discord server](https://discord.com/invite/vXVurFfwe9)
+
+**\***: S3 is a trademark of Amazon Technologies, Inc. OVHcloud’s service is not sponsored by, endorsed by, or otherwise affiliated with Amazon Technologies, Inc.
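One practical note on the `ovhai datastore add s3` step above: the endpoint, region and keys can be sanity-checked from Python before any job runs. A minimal `boto3` sketch under assumed placeholder values (boto3 is not part of this project's tooling):

```python
import boto3

# Same connection details as passed to `ovhai datastore add s3`; replace the placeholders.
client = boto3.client(
    "s3",
    endpoint_url="<s3_endpoint_url>",
    region_name="<s3_region>",
    aws_access_key_id="<s3_access_key>",
    aws_secret_access_key="<s3_secret_key>",
)

# If the endpoint and credentials are valid, this prints the buckets visible to the user,
# e.g. `neuralangelo-experiments-lego` once `make push-data` has created it.
print([bucket["Name"] for bucket in client.list_buckets()["Buckets"]])
```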

notebooks/computer-vision/object-detection/miniconda/yolov8/notebook_object_detection_yolov8_rock-paper-scissors.ipynb

Lines changed: 1 addition & 1 deletion
@@ -47,7 +47,7 @@
"\n",
"- create a Roboflow account\n",
"- click on `Download` in order to download the dataset\n",
-"- select`YOLO v8 PyTorch` format\n",
+"- select `YOLOv8` format\n",
"- choose the method `show download code`\n",
"\n",
"You will get a URL (`<dataset_url>`) that will allow you to download your dataset directly inside the notebook.\n",

notebooks/getting-started/S3/use-s3-buckets-with-ai-tools.ipynb

Lines changed: 17 additions & 15 deletions
@@ -5,28 +5,28 @@
"id": "ac60a3db-eb56-4fd4-b139-95114faaee64",
"metadata": {},
"source": [
-"# Using objects from your S3 buckets in OVHcloud AI Tools\n",
+"# Using objects from your S3* compatible buckets in OVHcloud AI Tools\n",
"\n",
-"This tutorial provides help to manage and use S3 buckets with AI Tools in Python, using the `boto3` library. We will show you how you can interact with your S3 Buckets and files by creating buckets, downloading objects, listing objects and reading their content when working with AI Notebooks, AI Training and AI Deploy.\n",
+"This tutorial helps you manage and use S3* compatible buckets with AI Tools in Python, using the `boto3` library. We will show you how you can interact with your S3 compatible buckets and files by creating buckets, downloading objects, listing objects and reading their content when working with AI Notebooks, AI Training and AI Deploy.\n",
"\n",
"## Requirements\n",
"\n",
-"To be able to follow this tutorial, you will need to have followed the [Data - S3 compliance with AI Tools documentation](https://help.ovhcloud.com/csm/en-gb-public-cloud-ai-s3-compliance?id=kb_article_view&sysparm_article=KB0058011) first, in particular the following steps:\n",
+"To follow this tutorial, you will need to have followed the [Data - Compliance between AI Tools and S3 compatible Object Storage](https://help.ovhcloud.com/csm/en-gb-public-cloud-ai-s3-compliance?id=kb_article_view&sysparm_article=KB0058011) guide first, in particular the following steps:\n",
"\n",
-"- Have created a S3 user\n",
+"- Have created an S3 compatible user\n",
"- Checked that this user has ***ObjectStore operator*** and ***AI Training Operator*** rights\n",
"- Have created a datastore with this user\n",
"\n",
"## Code\n",
"\n",
"The different steps are as follows:\n",
"- Setup the environment\n",
-"- Set your S3 datastore\n",
-"- List all S3 buckets in your S3 datastore\n",
+"- Set your S3 compatible datastore\n",
+"- List all S3 compatible buckets in your S3 compatible datastore\n",
"- Create a new bucket\n",
"- List all objects of a specific bucket\n",
"- Read content from objects\n",
-"- Download object from S3 bucket\n",
+"- Download object from S3 compatible bucket\n",
"\n",
"### Setup the environment\n",
"\n",
@@ -72,15 +72,15 @@
"id": "8347366f-98ec-4f4b-87c8-f53aea2b20e7",
"metadata": {},
"source": [
-"### Set your S3 datastore"
+"### Set your S3 compatible datastore"
]
},
{
"cell_type": "markdown",
"id": "10bc5759-97ce-472c-8c96-cb7224387bcc",
"metadata": {},
"source": [
-"To interact with an S3 bucket, we need to initialize a S3 client and configure it with our user credentials (`s3_access_key`, `s3_secret_key`, the `endpoint URL`, and the selected region).\n",
+"To interact with an S3 compatible bucket, we need to initialize an S3 compatible client and configure it with our user credentials (`s3_access_key`, `s3_secret_key`, the `endpoint URL`, and the selected region).\n",
"\n",
"***Make sure to replace these credentials with yours.***"
]
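For reference, a minimal sketch of the initialization this cell describes, using `boto3` (the endpoint and region shown are illustrative assumptions, not the notebook's exact values):

```python
import boto3

# Initialize the S3 compatible client with your user credentials; replace every placeholder.
s3_client = boto3.client(
    "s3",
    aws_access_key_id="<s3_access_key>",
    aws_secret_access_key="<s3_secret_key>",
    endpoint_url="https://s3.gra.io.cloud.ovh.net/",  # assumed OVHcloud GRA endpoint
    region_name="gra",
)
```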
@@ -114,9 +114,9 @@
"id": "84a88d2f-5a8a-4c82-a4b3-9fc7b5d24ec1",
"metadata": {},
"source": [
-"Once the S3 client has been initialized, we are ready to communicate with the S3-compatible storage service. Many things can be done.\n",
+"Once the S3 compatible client has been initialized, we are ready to communicate with the S3 compatible storage service. Many operations are possible.\n",
"\n",
-"### List all S3 buckets in your S3 datastore"
+"### List all S3 compatible buckets in your S3 compatible datastore"
]
},
{
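A sketch of the listing step named in this heading, assuming the `s3_client` from the previous snippet:

```python
# List every bucket visible to the configured user in this datastore.
response = s3_client.list_buckets()
for bucket in response["Buckets"]:
    print(bucket["Name"])
```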
@@ -431,9 +431,9 @@
"id": "12ea21bf-bde2-481e-b9ea-be327ec365f5",
"metadata": {},
"source": [
-"### Download object from S3 bucket\n",
+"### Download object from S3 compatible bucket\n",
"\n",
-"You can download any object from your S3 bucket into your environment. Here is how to download the `requirements.txt` file under the name `local-object.txt`"
+"You can download any object from your S3 compatible bucket into your environment. Here is how to download the `requirements.txt` file under the name `local-object.txt`:"
]
},
{
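A sketch of that download, assuming the `s3_client` from earlier and a placeholder bucket name:

```python
# Download `requirements.txt` from the bucket and save it locally as `local-object.txt`.
s3_client.download_file("<bucket_name>", "requirements.txt", "local-object.txt")
```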
@@ -466,11 +466,13 @@
"source": [
"### Conclusion\n",
"\n",
-"We hope this example has helped you to manipulate the objects in your S3 buckets directly from the OVHcloud AI Tools products. \n",
+"We hope this example has helped you manipulate the objects in your S3 compatible buckets directly from the OVHcloud AI Tools products.\n",
"\n",
"The operations presented here are not the only possible actions. Please consult the documentation for a full list of available commands.\n",
"\n",
-"More commands here : https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html"
+"More commands here: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html\n",
+"\n",
+"**\\***: S3 is a trademark of Amazon Technologies, Inc. OVHcloud’s service is not sponsored by, endorsed by, or otherwise affiliated with Amazon Technologies, Inc."
]
}
],
