4 changes: 2 additions & 2 deletions menu/changelogs.json
@@ -150,8 +150,8 @@
"label": "Data Warehouse for ClickHouse®"
},
{
"category": "data-lab",
"label": "Data Lab for Apache Spark™"
"category": "apache-spark",
"label": "Clusters for Apache Spark™"
},
{
"category": "nats",
4 changes: 2 additions & 2 deletions menu/filters.json
@@ -115,8 +115,8 @@
{
"items": [
{
"category": "data-lab",
"label": "Data Lab for Apache Spark™"
"category": "apache-spark",
"label": "Clusters for Apache Spark™"
},
{
"category": "nats",
2 changes: 1 addition & 1 deletion menu/navigation.ts
@@ -10,7 +10,7 @@ import { clustersForKafkaMenu } from "../pages/clusters-for-kafka/menu"
import { cockpitMenu } from "../pages/cockpit/menu"
import { containerRegistryMenu } from "../pages/container-registry/menu"
import { cpanelHostingMenu } from "../pages/cpanel-hosting/menu"
import { dataLabMenu } from "../pages/data-lab/menu"
import { dataLabMenu } from "../pages/apache-spark/menu"
import { dataOrchestratorMenu } from "../pages/data-orchestrator/menu"
import { dataWarehouseMenu } from "../pages/data-warehouse/menu"
import { dediboxMenu } from "../pages/dedibox/menu"
File renamed without changes.
2 changes: 1 addition & 1 deletion pages/data-lab/faq.mdx → pages/apache-spark/faq.mdx
@@ -59,7 +59,7 @@ Yes, you can run your cluster on either CPUs or GPUs. Scaleway leverages Nvidia'

Yes, you can connect a different notebook via Private Networks.

Refer to the [dedicated documentation](/data-lab/how-to/use-private-networks/) for comprehensive information on how to connect to an Apache Spark™ cluster over Private Networks.
Refer to the [dedicated documentation](/apache-spark/how-to/use-private-networks/) for comprehensive information on how to connect to an Apache Spark™ cluster over Private Networks.

## Usage and management

@@ -11,13 +11,13 @@ import Requirements from '@macros/iam/requirements.mdx'

This page explains how to access and use the notebook environment of your Apache Spark™ cluster using the Scaleway console.

You can also use your Apache Spark™ cluster using a separate notebook (JupyterLab, Zeppelin, etc.) running on a Scaleway Instance within the same Private Network as your cluster. Refer to the [dedicated documentation](/data-lab/how-to/use-private-networks/) for more information.
You can also use your Apache Spark™ cluster using a separate notebook (JupyterLab, Zeppelin, etc.) running on a Scaleway Instance within the same Private Network as your cluster. Refer to the [dedicated documentation](/apache-spark/how-to/use-private-networks/) for more information.

<Requirements />

- A Scaleway account logged into the [console](https://console.scaleway.com)
- [Owner](/iam/concepts/#owner) status or [IAM permissions](/iam/concepts/#permission) allowing you to perform actions in the intended Organization
- Created an [Apache Spark™ cluster](/data-lab/how-to/create-data-lab/) with a notebook
- Created an [Apache Spark™ cluster](/apache-spark/how-to/create-spark-cluster/) with a notebook
- Created an [IAM API key](/iam/how-to/create-api-keys/)

## How to access the notebook of your cluster
@@ -15,7 +15,7 @@ This page explains how to access the Apache Spark™ UI of your Apache Spark™

- A Scaleway account logged into the [console](https://console.scaleway.com)
- [Owner](/iam/concepts/#owner) status or [IAM permissions](/iam/concepts/#permission) allowing you to perform actions in the intended Organization
- Created an [Apache Spark™ cluster](/data-lab/how-to/create-data-lab/)
- Created an [Apache Spark™ cluster](/apache-spark/how-to/create-spark-cluster/)
- Created an [IAM API key](/iam/how-to/create-api-keys/)

1. Click **Apache Spark™** under **Data & Analytics** on the side menu. The **Clusters for Apache Spark™** page displays.
@@ -29,7 +29,7 @@ Clusters for Apache Spark™ is a product designed to assist data scientists and

6. Enter the desired number of worker nodes.

7. Add a [persistent volume](/data-lab/concepts/#persistent-volume) if required, then enter a volume size according to your needs.
7. Add a [persistent volume](/apache-spark/concepts/#persistent-volume) if required, then enter a volume size according to your needs.

<Message type="note">
Persistent volume usage depends on your workload, and only the actual usage will be billed, within the limit defined. A minimum of 1 GB is required to run the notebook.
File renamed without changes.
@@ -14,7 +14,7 @@ This page explains how to manage and delete your Apache Spark™ cluster.

- A Scaleway account logged into the [console](https://console.scaleway.com)
- [Owner](/iam/concepts/#owner) status or [IAM permissions](/iam/concepts/#permission) allowing you to perform actions in the intended Organization
- Created an [Apache Spark™ cluster](/data-lab/how-to/create-data-lab/)
- Created an [Apache Spark™ cluster](/apache-spark/how-to/create-spark-cluster/)

## How to manage an Apache Spark™ cluster

@@ -23,8 +23,8 @@ This page explains how to manage and delete your Apache Spark™ cluster.
2. Click the name of the cluster you want to manage. The **Overview** tab of the cluster displays. From this page, you can:
- Consult the configuration of your cluster.
- View the network information of your cluster.
- [Access the Apache Spark™ UI](/data-lab/how-to/access-spark-ui/) of your cluster.
- [Access the notebook environment](/data-lab/how-to/access-notebook/) of your cluster.
- [Access the Apache Spark™ UI](/apache-spark/how-to/access-spark-ui/) of your cluster.
- [Access the notebook environment](/apache-spark/how-to/access-notebook/) of your cluster.

3. Click the **Settings** tab.

@@ -37,7 +37,7 @@ This page explains how to manage and delete your Apache Spark™ cluster.
- [Delete your cluster](#how-to-delete-an-apache-sparktm-cluster).

<Message type="note">
Once you have created a cluster, you cannot edit the node type. You must [create a new cluster](/data-lab/how-to/create-data-lab/) instead.
Once you have created a cluster, you cannot edit the node type. You must [create a new cluster](/apache-spark/how-to/create-spark-cluster/) instead.
</Message>

## How to delete an Apache Spark™ cluster
@@ -11,7 +11,7 @@ import Requirements from '@macros/iam/requirements.mdx'

[Private Networks](/vpc/concepts/#private-networks) allow your Clusters for Apache Spark™ cluster to communicate in an isolated and secure network without needing to be connected to the public internet.

At the moment, Apache Spark™ clusters can only be attached to a Private Network [during their creation](/data-lab/how-to/create-data-lab/), and cannot be detached and reattached to another Private Network afterward.
At the moment, Apache Spark™ clusters can only be attached to a Private Network [during their creation](/apache-spark/how-to/create-spark-cluster/), and cannot be detached and reattached to another Private Network afterward.

For full information about Scaleway Private Networks and VPC, see our [dedicated documentation](/vpc/) and [best practices guide](/vpc/reference-content/getting-most-private-networks/).

@@ -220,6 +220,6 @@ Your notebook hosted on an Instance is ready to be used over Private Networks.
- `<INSTANCE_PN_IP>` can be found in the **Private Networks** tab of your Instance. Make sure to only copy the IP, and not the `/22` part.
</Message>

8. [Access the Apache Spark™ UI](/data-lab/how-to/access-spark-ui/) of your cluster. The list of completed applications displays. From here, you can inspect the jobs previously started using `spark-submit`.
8. [Access the Apache Spark™ UI](/apache-spark/how-to/access-spark-ui/) of your cluster. The list of completed applications displays. From here, you can inspect the jobs previously started using `spark-submit`.

You have successfully run workloads on your cluster from an Instance over a Private Network.
10 changes: 5 additions & 5 deletions pages/data-lab/index.mdx → pages/apache-spark/index.mdx
@@ -7,7 +7,7 @@ description: Dive into Scaleway Clusters for Apache Spark™ with our quickstart
productName="Clusters for Apache Spark™"
productLogo="dataLab"
description="Clusters for Apache Spark™ is designed to assist data scientists and data engineers in performing calculations on a remotely managed Apache Spark infrastructure."
url="/data-lab/quickstart"
url="/apache-spark/quickstart"
label="Clusters for Apache Spark™ Quickstart"
/>

@@ -19,27 +19,27 @@ description: Dive into Scaleway Clusters for Apache Spark™ with our quickstart
icon="rocket"
description="Learn how to create, use, manage, and delete an Apache Spark™ cluster in a few steps."
label="View Quickstart"
url="/data-lab/quickstart/"
url="/apache-spark/quickstart/"
/>
<SummaryCard
title="Concepts"
icon="info"
description="Core concepts that give you a better understanding of Scaleway Clusters for Apache Spark™."
label="View Concepts"
url="/data-lab/concepts/"
url="/apache-spark/concepts/"
/>
<SummaryCard
title="How-tos"
icon="help-circle-outline"
description="Check our guides to creating, using, and managing Apache Spark™ clusters and their features."
label="View How-tos"
url="/data-lab/how-to/"
url="/apache-spark/how-to/"
/>
</Grid>

## Changelog

<ChangelogList
productName="data-lab"
productName="apache-spark"
numberOfChanges={3}
/>
8 changes: 4 additions & 4 deletions pages/data-lab/menu.ts → pages/apache-spark/menu.ts
@@ -2,7 +2,7 @@ export const dataLabMenu = {
items: [
{
label: 'Overview',
slug: '../data-lab',
slug: '../apache-spark',
},
{
label: 'Concepts',
@@ -20,7 +20,7 @@ export const dataLabMenu = {
items: [
{
label: 'Create a Spark™ cluster',
slug: 'create-data-lab',
slug: 'create-spark-cluster',
},
{
label: 'Access the notebook',
@@ -36,7 +36,7 @@
},
{
label: 'Manage and delete a cluster',
slug: 'manage-delete-data-lab',
slug: 'manage-delete-spark-cluster',
},
],
label: 'How to',
@@ -48,5 +48,5 @@
},
],
label: 'Clusters for Apache Spark™',
slug: 'data-lab',
slug: 'apache-spark',
}
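The category rename has to land identically in every menu file that carries it (here, `menu/changelogs.json` and `menu/filters.json`). A minimal sketch of a consistency check, as a hypothetical helper that is not part of this PR; the `MenuEntry` shape is an assumption drawn from the `{ "category", "label" }` pairs visible in the diff:

```typescript
// Hypothetical post-rename check: a category's label must agree across all
// menu files that mention it. MenuEntry mirrors the JSON entries in the diff.
interface MenuEntry {
  category: string;
  label: string;
}

function labelFor(entries: MenuEntry[], category: string): string | undefined {
  return entries.find((e) => e.category === category)?.label;
}

function categoryIsConsistent(files: MenuEntry[][], category: string): boolean {
  const labels = files.map((entries) => labelFor(entries, category));
  // Every file must contain the category, and all labels must match the first.
  return labels.every((l) => l !== undefined && l === labels[0]);
}
```

Run against the parsed contents of both JSON files, a `false` result would flag an entry the rename missed.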
@@ -46,7 +46,7 @@ This documentation explains how to quickly create an Apache Spark™ cluster, ac

Once the cluster is created, you are directed to its **Overview** page.

Refer to the [dedicated documentation](/data-lab/how-to/create-data-lab/) for detailed information on how to create a cluster.
Refer to the [dedicated documentation](/apache-spark/how-to/create-spark-cluster/) for detailed information on how to create a cluster.

## How to connect to your cluster's notebook

@@ -13,7 +13,7 @@ import Requirements from '@macros/iam/requirements.mdx'

- A Scaleway account logged into the [console](https://console.scaleway.com)
- [Owner](/iam/concepts/#owner) status or [IAM permissions](/iam/concepts/#permission) allowing you to perform actions in the intended Organization
- An [Apache Spark™ cluster](/data-lab/how-to/create-data-lab/)
- An [Apache Spark™ cluster](/apache-spark/how-to/create-spark-cluster/)

## Timeout errors

@@ -27,7 +27,7 @@ The Apache Spark™ cluster has zero worker nodes provisioned and cannot raise a

### Solution

[Edit your cluster configuration](/data-lab/how-to/manage-delete-data-lab/) by provisioning at least one worker node to be able to run calculations with it.
[Edit your cluster configuration](/apache-spark/how-to/manage-delete-spark-cluster/) by provisioning at least one worker node to be able to run calculations with it.

<Message type="note">
You can provision zero worker nodes again to retain your cluster and notebook configurations while minimizing costs.
@@ -19,13 +19,13 @@ productIcon: DistributedDataLabProductIcon
<Card
title="Clusters for Apache Spark™ FAQ"
description="General info on Clusters for Apache Spark™."
url="/data-lab/faq/"
url="/apache-spark/faq/"
label="See more"
/>
</Grid>

## Clusters for Apache Spark™ troubleshooting pages

<LinksList>
- [Troubleshooting Clusters for Apache Spark™ execution issues](/data-lab/troubleshooting/cannot-run-data-lab)
- [Troubleshooting Clusters for Apache Spark™ execution issues](/apache-spark/troubleshooting/cannot-run-spark-cluster)
</LinksList>
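A path rename of this scope is easy to leave half-done across dozens of `.mdx` pages. A minimal sketch of a scan for markdown links still using the old `/data-lab/` prefix, as a hypothetical helper that is not part of this PR:

```typescript
// Hypothetical post-rename check: given the raw text of an .mdx page, return
// any markdown link targets that still begin with the old /data-lab/ prefix.
function staleDataLabLinks(text: string): string[] {
  // Matches the target of inline markdown links: ](/data-lab/...)
  const linkPattern = /\]\((\/data-lab\/[^)\s]*)\)/g;
  const stale: string[] = [];
  for (const match of text.matchAll(linkPattern)) {
    stale.push(match[1]);
  }
  return stale;
}
```

Mapped over every page in the repo, a non-empty result would point at a link this diff missed.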