Appendices


Appendix 1: Creating the Azure PostgreSQL instance and the two databases

This appendix gives the instructions for creating the Azure PostgreSQL instance (PostgreSQL 13.x) and the two required databases: keycloak and vectice.

First, go to the “Azure Database for PostgreSQL flexible servers” menu and press “Create”.

You can then choose the resource group, which will also include the AKS Cluster and the Storage Account. Choose PostgreSQL 13, then press “Next: Networking >”.

For the Networking part, use Private access, as it is more secure.

A dedicated VNet for the PostgreSQL instance is created along with the instance, as well as a dedicated Private DNS zone. For the pods to communicate with the SQL instance, we’ll perform two more actions:

  1. Create a Peering between the AKS VNet and the PostgreSQL VNet

  2. Create a Virtual Network Link from the Private DNS Zone to the AKS VNet

You can then select the “Review + create” button and create the SQL instance.
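Equivalently, the instance can be provisioned with the Azure CLI; a minimal sketch, where all resource names and credentials are placeholders to adapt to your environment:

```shell
# Create the PostgreSQL 13 flexible server with private access
az postgres flexible-server create \
  --resource-group <resource-group> \
  --name <server-name> \
  --version 13 \
  --admin-user <admin-user> \
  --admin-password <admin-password> \
  --vnet <postgres-vnet> \
  --subnet <postgres-subnet> \
  --private-dns-zone <private-dns-zone-name>
```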

Then create the two databases (vectice and keycloak): from the “Databases” menu, select CREATE DATABASE and create vectice, then repeat the operation for keycloak.
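If you prefer the Azure CLI, the two databases can be created with `az postgres flexible-server db create` (resource names are placeholders):

```shell
az postgres flexible-server db create \
  --resource-group <resource-group> --server-name <server-name> \
  --database-name vectice
az postgres flexible-server db create \
  --resource-group <resource-group> --server-name <server-name> \
  --database-name keycloak
```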

Also enable the UNACCENT extension.
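On Azure Database for PostgreSQL flexible server, an extension must be allowlisted through the `azure.extensions` server parameter before it can be created in a database. A sketch with placeholder names:

```shell
# Allowlist the extension on the server
az postgres flexible-server parameter set \
  --resource-group <resource-group> --server-name <server-name> \
  --name azure.extensions --value UNACCENT

# Then, connected to the database (e.g. with psql), create it:
#   CREATE EXTENSION IF NOT EXISTS unaccent;
```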

Check in the Settings > Networking menu whether the server CA certificates are required; they can be retrieved from the official documentation: Microsoft RSA Root CA 2017, DigiCert Global Root G2, and DigiCert Global Root CA.

Once downloaded, format them as PEM using the command below. A default value, already base64-encoded, is provided in the myvalues file.

openssl x509 -inform DER -in certificate_downloaded.crt -out certificate.pem -outform PEM
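To sanity-check the conversion and base64 encoding end to end, you can rehearse the pipeline locally with a throwaway self-signed certificate standing in for the downloaded root CA (the real CA files go through the same last two steps):

```shell
# Throwaway self-signed certificate in DER format, standing in for the downloaded CA
openssl req -x509 -newkey rsa:2048 -nodes -keyout demo.key \
  -out certificate_downloaded.crt -outform DER -days 1 -subj "/CN=demo"

# Convert DER to PEM as required by the myvalues file
openssl x509 -inform DER -in certificate_downloaded.crt -out certificate.pem -outform PEM

# Base64-encode on a single line for embedding in the values file
base64 -w0 certificate.pem > certificate.b64
```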

Appendix 2: Creating the Kubernetes Cluster

Cluster creation

Follow the cluster creation steps in the Azure portal console.
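The console steps can also be scripted with the Azure CLI; a minimal sketch, where the node count, names, and subnet ID are placeholder assumptions to adapt to your sizing:

```shell
az aks create \
  --resource-group <resource-group> \
  --name <cluster-name> \
  --node-count 3 \
  --network-plugin azure \
  --vnet-subnet-id <aks-subnet-id> \
  --generate-ssh-keys
```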

Appendix 3: Create Azure Blob Storage and access

Here are the instructions for creating the storage account, the blob storage container, and the Service Principal used to access the container. The Service Principal will have read/write permission on the blob storage container, which is used for storing asset metadata from Models, Datasets, Notes, and Graphs.

Storage account creation

First, create a storage account. Go to the “Storage accounts” menu and press Create.

Then fill in the fields, making sure to place it in the same resource group and the same region as the AKS Cluster and the Azure PostgreSQL instance.

In the Networking part, we select the AKS VNet, so that the content of the storage account can be accessed from the AKS Cluster.
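The same steps can be scripted with the Azure CLI; a sketch with placeholder names, where the network rule mirrors the console step of allowing access from the AKS VNet:

```shell
az storage account create \
  --name <storageaccount> \
  --resource-group <resource-group> \
  --location <region> \
  --sku Standard_LRS

# Allow access from the AKS subnet
az storage account network-rule add \
  --resource-group <resource-group> --account-name <storageaccount> \
  --vnet-name <aks-vnet> --subnet <aks-subnet>
```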

Blob storage container creation

Create the container within the storage account:
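Or, via the Azure CLI (placeholder names):

```shell
az storage container create \
  --name <container-name> \
  --account-name <storageaccount> \
  --auth-mode login
```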

Creating the Service Principal

First, create a Service Principal. To do so, go to “Azure Active Directory”, then the “App registrations” menu, and press “New registration”:

Then, name the Service Principal and Register it.

After that, create a secret for the Service Principal:

Retrieve the value of the secret now, because it will not be visible again afterwards.

Retrieve the values of the Application (client) ID and Directory (tenant) ID.

Come back to the storage account previously created, in order to grant the Service Principal access to the Container.

Go to the Access control (IAM) menu and, in the “Role assignments” tab, click Add > Add role assignment.

Set the role “Storage Blob Data Contributor”, and click on Next to define the members:

Select the Service Principal created earlier:

Review and assign the role.
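The whole Service Principal flow (registration, secret, and role assignment) can also be done with the Azure CLI; a sketch with placeholder names:

```shell
# Create the Service Principal; the output shows appId, password (the secret,
# visible only once), and tenant
az ad sp create-for-rbac --name <sp-name>

# Grant it read/write on the storage account's blob data
az role assignment create \
  --assignee <appId> \
  --role "Storage Blob Data Contributor" \
  --scope "$(az storage account show \
      --name <storageaccount> --resource-group <resource-group> \
      --query id -o tsv)"
```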

Appendix 4: Creating the network links between the SQL Instance and the AKS Cluster

Virtual network link creation

On the Private DNS Zone created along with the SQL instance, go to the “Virtual network links” menu, and press Add:

Add the link between the Private DNS Zone and the AKS VNet, and enable auto-registration:
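The equivalent Azure CLI command (placeholder names):

```shell
az network private-dns link vnet create \
  --resource-group <resource-group> \
  --zone-name <private-dns-zone> \
  --name aks-vnet-link \
  --virtual-network <aks-vnet-id> \
  --registration-enabled true
```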

VNet Peering

Create the VNet Peering between the PostgreSQL VNet and the AKS VNet. To do so, go to the “Virtual networks” menu and click on the AKS VNet.

Proceed with peering the two VNets:
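Peering is directional, so it must be created from each VNet to the other; an Azure CLI sketch with placeholder names:

```shell
# AKS VNet -> PostgreSQL VNet
az network vnet peering create \
  --resource-group <resource-group> --name aks-to-postgres \
  --vnet-name <aks-vnet> --remote-vnet <postgres-vnet-id> \
  --allow-vnet-access

# PostgreSQL VNet -> AKS VNet
az network vnet peering create \
  --resource-group <resource-group> --name postgres-to-aks \
  --vnet-name <postgres-vnet> --remote-vnet <aks-vnet-id> \
  --allow-vnet-access
```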

Appendix 5: Disaster recovery plan

Backup strategy

As there is no persistent data on the Kubernetes Cluster, no backup of the cluster is necessary. At a minimum, we recommend a daily backup of the Azure Blob Storage container and the PostgreSQL instance.

For the Azure Blob Storage container, we recommend copying its content into another Azure Blob Storage container, under a folder named with a timestamp created at each backup.
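One way to implement this, as an illustrative sketch rather than the only option, is a scheduled azcopy job that copies the container into a timestamped folder of a backup container; authentication (for example via a SAS token or `azcopy login`) is assumed to be in place:

```shell
TS="$(date +%Y-%m-%d)"
azcopy copy \
  "https://<account>.blob.core.windows.net/<container>/*" \
  "https://<backup-account>.blob.core.windows.net/<backup-container>/${TS}/" \
  --recursive
```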

Recovery Mechanism

Learn more about our data storage and backup policies.

The default PostgreSQL instance backup strategy is a daily backup of the whole instance. Backups can also be customized; more information can be found in the Azure documentation.

Depending on the nature of the disaster, recovery solutions might change. In case of an infrastructure issue, please refer to the Provisioning the infrastructure section to recreate the failing infrastructure elements.

The restoration of the container content consists of copying the content of the time-stamped folder described in the Backup strategy section back to the application Azure Blob Storage container that the Vectice Helm configuration points at. The restoration of the PostgreSQL instance consists of restoring the database backup following the Azure documentation.

If the issue requires the creation of a new Kubernetes Cluster, for example in a new region, please refer to the Application deployment section to redeploy the software. Make sure to fill in the values according to your new deployment environment.
