Version: Next

Quick Run

Updated 2025.02.20



This page guides you through quickly installing the latest version of ALO, setting up a sample AI Solution (Titanic), and registering the AI Solution without modifying the code.


step1: Start ALO

This step sets up a virtual environment and installs ALO-v3.

step1-1: Set ALO Execution Environment

Below is an example of an Anaconda virtual environment. You can set up a virtual environment capable of running Python 3.10 with pyenv or other methods.

conda init bash ## Initialize conda environment
exec bash ## Restart bash
# conda create --prefix /home/jovyan/testenv python=3.10 ## Method to install a persistent virtual environment when running Jupyter from a docker
conda create -n {virtual_environment_name} python=3.10 ## 3.10 is mandatory
conda activate {virtual_environment_name} ## If conda activate does not execute, run exec bash
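Once the environment is active, you can sanity-check the interpreter from Python itself. A minimal sketch (the helper name is illustrative, not part of ALO):

```python
import sys

def is_python_310(version_info=sys.version_info):
    """Return True when the interpreter is exactly Python 3.10.x,
    which this Quick Run requires."""
    return tuple(version_info[:2]) == (3, 10)

print(is_python_310())
```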

step1-2: Install ALO

Run the following command in your desired virtual environment.

pip install mellerikat-alo
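To confirm the installation without invoking the CLI, you can query the installed package metadata via the standard library. A small sketch (the helper name is illustrative):

```python
from importlib import metadata

def installed_version(package):
    """Return the installed version string of a package, or None if absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

print(installed_version("mellerikat-alo"))
```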


step2: Develop AI Solution

To understand how an AI Solution runs in ALO, install and try out the 'titanic' sample solution.

step2-1: Install and Run Titanic

# Install the titanic example in the current directory
alo example

After successful execution, the following folders and files are generated. They are the essential components for creating and registering an AI Solution.

Current directory/
├── experimental_plan.yaml   # Information required for creating the AI Solution (mapped with titanic.py)
├── inference_dataset        # Data for model inference
│   └── inference_dataset.csv
├── setting                  # Folder for infrastructure settings
│   ├── infra_config.yaml
│   ├── sagemaker_config.yaml
│   └── solution_info.yaml
├── titanic.py               # Python file created for Quick Run
└── train_dataset            # Data for model training
    └── train.csv

Next, run ALO:

alo run  # ALO run

After alo run executes successfully, the following two folders will be added.

Current directory/
├── inference_artifact           # Path specified by artifact_uri under the inference: section of experimental_plan.yaml. Stores information on the inference pipeline.
│   ├── inference_artifacts.zip  # A zip file containing the output (result.csv), logs (pipeline.log, process.log), and score (inference_summary.yaml).
│   ├── pipeline.log             # Execution log of titanic.py.
│   └── process.log              # Overall log of the alo execution.
└── train_artifact               # Path specified by artifact_uri under the train: section of experimental_plan.yaml. Stores information on the train pipeline.
    ├── model.tar.gz             # The trained model, stored as a gzipped tar file.
    ├── pipeline.log             # Execution log of titanic.py.
    ├── process.log              # Overall log of the alo execution.
    └── train_artifacts.tar.gz   # Contains logs (pipeline.log, process.log) and score (inference_summary.yaml).
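The archives above can be inspected with Python's standard tarfile and zipfile modules. A minimal sketch, assuming the default paths and filenames from the listing above (adjust them if your artifact_uri differs):

```python
import tarfile
import zipfile
from pathlib import Path

def unpack_artifacts(train_dir="train_artifact",
                     inference_dir="inference_artifact",
                     dest="unpacked"):
    """Unpack model.tar.gz and inference_artifacts.zip into `dest`
    so the model, result.csv, and logs can be inspected."""
    out = Path(dest)
    out.mkdir(parents=True, exist_ok=True)

    model = Path(train_dir) / "model.tar.gz"
    if model.exists():
        with tarfile.open(model, "r:gz") as tar:
            tar.extractall(out / "model")

    results = Path(inference_dir) / "inference_artifacts.zip"
    if results.exists():
        with zipfile.ZipFile(results) as zf:
            zf.extractall(out / "inference")

    # Return the unpacked file names for a quick overview.
    return sorted(p.name for p in out.rglob("*") if p.is_file())
```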


step3: Register Solution

Follow these steps to register an AI Solution:

  1. Write the infrastructure information for registering the solution in the setting/infra_config.yaml file.
  2. Write the information of the solution to register in the setting/solution_info.yaml file.
  3. Execute solution registration.
  4. Test AI training.
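Before running the registration, it can help to confirm that both configuration files from steps 1-2 are in place. A hypothetical pre-flight check (the helper name is illustrative, not part of ALO):

```python
from pathlib import Path

def missing_configs(setting_dir="setting"):
    """Return the names of required configuration files that are
    not yet present in the setting folder."""
    required = ["infra_config.yaml", "solution_info.yaml"]
    root = Path(setting_dir)
    return [name for name in required if not (root / name).exists()]
```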

step3-1: Write infra_config.yaml File

Write the information of the environment where the AI Solution will be registered in the ./{project_home}/setting/infra_config.yaml file.

AIC_URI: https://aicond.meerkat-dev.com/
#CLOUD_SERVICE_ACCOUNT:
#  TYPE: GCP # AWS
#  CREDENTIALS: # for GCP
#    KEY_FILE: *.json
#  CREDENTIALS: # for AWS
#    PROFILE: profile-name
AWS_KEY_PROFILE: meerkat-dev
GCP_CREDENTIALS: key.json
BUILD_METHOD: codebuild # docker / buildah / codebuild
CODEBUILD_ENV_COMPUTE_TYPE: BUILD_GENERAL1_SMALL
CODEBUILD_ENV_TYPE: LINUX_CONTAINER
LOGIN_MODE: static
REGION: ap-northeast-2
REPOSITORY_TAGS: # ECR repository tags setting: list of {Key: Value}
- Key: Owner
  Value: Mellerikat
- Key: Phase
  Value: DEV
VERSION: 1.1
WORKSPACE_NAME: common-ws
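A quick way to catch a missing key before registration is a rough top-level scan of the file. A minimal sketch, assuming the required-key list inferred from the example above (this is a string scan, not a full YAML parser):

```python
def top_level_keys(yaml_text):
    """Collect top-level keys from a simple, flat YAML document.
    Skips comments, list items, and indented lines."""
    keys = []
    for line in yaml_text.splitlines():
        if line and not line[0].isspace() \
                and not line.startswith(("#", "-")) and ":" in line:
            keys.append(line.split(":", 1)[0].strip())
    return keys

def check_infra_config(yaml_text,
                       required=("AIC_URI", "BUILD_METHOD",
                                 "REGION", "WORKSPACE_NAME")):
    """Return the required keys that are absent from the config text."""
    present = set(top_level_keys(yaml_text))
    return [k for k in required if k not in present]
```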

step3-2: Write solution_info.yaml File

Write the following contents in the setting/solution_info.yaml file. The 'name' key is crucial: the name of the AI Solution to register must be an all-lowercase combination of letters and dashes.

contents_type:
  labeling_column_name: null
  support_labeling: false
detail:
- content: content_1
  title: title_1
inference:
  cpu: amd64
  datatype: table
  gpu: false
  only: false
name: test-0207-1
overview: test_overview
train:
  datatype: table
  gpu: false
type: private
update: false

If you want to use the re-training feature after running the operation, change support_labeling to true under contents_type. Use detail to add more information about the AI Solution when the overview is not enough. Define the specifications for inference under inference; change gpu to true if you want to use a GPU for inference. Lastly, set update to true only if you are updating an already registered AI Solution.
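The naming rule can be enforced with a small check. A hedged sketch: the stated rule allows lowercase letters and dashes, and the sample name above (test-0207-1) also contains digits, so this helper permits them as well:

```python
import re

def is_valid_solution_name(name):
    """True when the name is a non-empty, all-lowercase combination of
    letters, digits, and dashes (digits allowed per the sample name)."""
    return re.fullmatch(r"[a-z0-9-]+", name or "") is not None
```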

step3-3: Execute AI Solution Registration

Run the following CLI command in the current directory.

alo register

If the command completes successfully, the AI Solution has been registered. For more detailed methods of creating AI Solutions, please refer to the next topics.