Entering Infra Information and AI Solution Information
To build and operate an AI Solution, you must configure the infra environment that will host and run it. The infra_config.yaml file in the setting folder, located alongside your modeling code, serves this purpose: it lets you define the infra information needed to register and manage the AI Solution.
The information set in infra_config.yaml is used when the solution is registered on the AI Conductor platform. By specifying essential infra details such as the AI Conductor web server address, the AWS cloud region, the workspace name, and the Docker image build method, developers and operators can deploy and run the AI Solution effectively.
Topics
Constructing the Infra Configuration File
Writing the infra_config.yaml File
Write the information about the infra where the AI Solution will be registered in the setting/infra_config.yaml file. The key fields are described below.
- AIC_URI: Web server address of the AI Conductor
- AWS_KEY_PROFILE: Name of the AWS configure profile that holds the access and secret keys used to access S3 and ECR
- BUILD_METHOD: Choose docker, buildah, or codebuild depending on your development environment (AWS CodeBuild performs the docker build on cloud infra)
- REPOSITORY_TAGS: Tags for the ECR repository, e.g. to carry billing information
- CODEBUILD_ENV_COMPUTE_TYPE: Compute type of the AWS CodeBuild build environment when BUILD_METHOD=codebuild
- CODEBUILD_ENV_TYPE: Build environment type for AWS CodeBuild when BUILD_METHOD=codebuild
- VERSION: Version of the Solution Metadata that links with the Meerkat system
- WORKSPACE_NAME: Workspace allocated by the AI Conductor
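Before registering a solution, it can help to sanity-check the fields above. The sketch below is illustrative only (the validator is not part of ALO); it takes the config as a plain dict, the shape `yaml.safe_load` would return, and applies the rules described above, including the extra CodeBuild fields required when BUILD_METHOD=codebuild.

```python
# Hypothetical validator for a parsed infra_config.yaml, shown as a plain dict.
# Key names follow this guide; the check itself is an illustrative sketch.

REQUIRED_KEYS = {"AIC_URI", "BUILD_METHOD", "WORKSPACE_NAME", "VERSION"}
VALID_BUILD_METHODS = {"docker", "buildah", "codebuild"}

def validate_infra_config(cfg: dict) -> list[str]:
    """Return a list of human-readable problems; empty means the config looks sane."""
    problems = [f"missing key: {k}" for k in sorted(REQUIRED_KEYS - cfg.keys())]
    method = cfg.get("BUILD_METHOD")
    if method not in VALID_BUILD_METHODS:
        problems.append(f"BUILD_METHOD must be one of {sorted(VALID_BUILD_METHODS)}, got {method!r}")
    if method == "codebuild":
        # CodeBuild builds additionally need the compute and environment type set.
        for k in ("CODEBUILD_ENV_COMPUTE_TYPE", "CODEBUILD_ENV_TYPE"):
            if k not in cfg:
                problems.append(f"missing key for codebuild: {k}")
    return problems

cfg = {
    "AIC_URI": "https://aicond.try-mellerikat.com/",
    "BUILD_METHOD": "docker",
    "WORKSPACE_NAME": "poc-ws",
    "VERSION": 1.1,
}
print(validate_infra_config(cfg))  # []
```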
If you use AWS keys instead of the SA (Service Account) method, register them in the terminal with the aws configure command as shown below, then write the profile name (e.g. alo-aws-profile) in the AWS_KEY_PROFILE field of infra_config.yaml.
```shell
aws configure --profile alo-aws-profile
AWS Access Key ID : {write access key}
AWS Secret Access Key : {write secret key}
Default region name : ap-northeast-2
Default output format :
```
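A misspelled or missing profile only surfaces later, when the build tries to push to ECR. The stdlib-only helper below is a hypothetical check (not part of ALO) that a profile name appears in an `~/.aws/credentials`-style file; it reads the INI text directly rather than calling the AWS SDK.

```python
# Hypothetical helper: confirm the profile named in AWS_KEY_PROFILE exists in
# an AWS credentials file, so the ECR push does not fail later. Stdlib only.
import configparser
import io

def profile_exists(credentials_text: str, profile: str) -> bool:
    """Check an ~/.aws/credentials-style INI text for a [profile] section."""
    parser = configparser.ConfigParser()
    parser.read_file(io.StringIO(credentials_text))
    return parser.has_section(profile)

sample = """\
[alo-aws-profile]
aws_access_key_id = EXAMPLE
aws_secret_access_key = EXAMPLE
"""
print(profile_exists(sample, "alo-aws-profile"))  # True
```

In practice you would read the text from `~/.aws/credentials` and pass the value of AWS_KEY_PROFILE as the profile argument.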
Note: You can refer to the sample in {solution_name}/setting/example_infra_config. Below is an example of setting/infra_config.yaml.
```yaml
AIC_URI: https://aicond.try-mellerikat.com/
#CLOUD_SERVICE_ACCOUNT:
#  TYPE: GCP  # AWS
#  CREDENTIALS:  # for GCP
#    KEY_FILE: *.json
#  CREDENTIALS:  # for AWS
#    PROFILE: profile-name
AWS_KEY_PROFILE: aws-key-profile
GCP_CREDENTIALS: key.json
BUILD_METHOD: docker  # docker / buildah / codebuild
CODEBUILD_ENV_COMPUTE_TYPE: BUILD_GENERAL1_SMALL
CODEBUILD_ENV_TYPE: LINUX_CONTAINER
LOGIN_MODE: static
REGION: ap-northeast-2
REPOSITORY_TAGS:  # ECR repository tags setting: list of {Key: Value}
  - Key: Owner
    Value: Mellerikat
  - Key: Phase
    Value: DEV
VERSION: 1.1
WORKSPACE_NAME: poc-ws
```
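REPOSITORY_TAGS uses the list-of-`{Key, Value}` shape that the AWS ECR tagging API expects. A small helper like the one below (illustrative, not part of ALO) flattens that shape into an ordinary dict, which is convenient for logging or comparing tag sets.

```python
# Illustrative only: flatten ECR-style {Key, Value} tag pairs into a dict.

def tags_to_dict(repository_tags: list[dict]) -> dict[str, str]:
    """Convert [{'Key': k, 'Value': v}, ...] into {k: v, ...}."""
    return {t["Key"]: t["Value"] for t in repository_tags}

repository_tags = [
    {"Key": "Owner", "Value": "Mellerikat"},
    {"Key": "Phase", "Value": "DEV"},
]
print(tags_to_dict(repository_tags))  # {'Owner': 'Mellerikat', 'Phase': 'DEV'}
```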
Writing the solution_info.yaml File
Write information about the AI Solution to be registered in the solution_info.yaml file.
The parameters that need to be modified are as follows:
- name: Solution name
- update: Whether to update an existing solution
- overview: Description of the created AI Solution
- support_labeling: Whether to use retraining functionality after deployment
- detail: Additional details needed beyond the overview
- inference: Configuration for the inference environment
- train: Configuration for the training environment
- ui_args: The type of each UI arg can be float, integer, string (or a list in string form), single_selection, or multi_selection. In ui_args_detail, write the required information for each type
```yaml
name: my-solution-name  # Solution name
type: private
update: false  # true # Whether to update an existing solution
overview: ""  # Solution description
contents_type:
  labeling_column_name: null
  support_labeling: false
detail:
  - content: "sample content001"
    title: "sample title001"
  - content: "sample content002"
    title: "sample title002"
inference:  # Inference specifications
  cpu: amd64  # CPU type amd64/arm64
  gpu: false  # Whether to use GPU
  only: false  # Inference-only mode
  datatype: table  # table or image
  # datatype:
  #   result: table  # table or image
train:  # Training specifications
  gpu: false  # Whether to use GPU
  datatype: table  # table or image
### ui_args declaration section ###
ui_args:
  function:
    readiness:
      default: ''
      description: Enter the column names of the training target x in the dataframe, separated by commas.
      name: x_columns
      range:
        - 1
        - 100000
      type: string
    train:
      default: auto
      description: Select an evaluation metric to choose the model during HPO.
      name: auto
      selectable:
        - auto
        - accuracy
        - f1
        - recall
        - precision
        - mse
        - r2
        - mae
        - rmse
      type: single_selection
update: false
```
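The ui_args rules above can be checked mechanically: each argument's `type` must be one of the supported kinds, and the selection types need a `selectable` list to choose from. The validator below is a hypothetical sketch of those rules, not ALO code.

```python
# Hypothetical check for a single ui_args entry, following the rules in this guide.

ALLOWED_TYPES = {"float", "integer", "string", "single_selection", "multi_selection"}

def check_ui_arg(arg: dict) -> list[str]:
    """Return a list of problems for one ui_args entry; empty means it looks valid."""
    problems = []
    if arg.get("type") not in ALLOWED_TYPES:
        problems.append(f"unsupported type: {arg.get('type')!r}")
    if arg.get("type") in {"single_selection", "multi_selection"} and not arg.get("selectable"):
        problems.append(f"{arg.get('name')}: selection types require a 'selectable' list")
    return problems

metric_arg = {
    "name": "auto",
    "type": "single_selection",
    "selectable": ["auto", "accuracy", "f1"],
}
print(check_ui_arg(metric_arg))  # []
```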