Version: docs v25.02

Release Notes

v1.2.0

Aug 14, 2025

Features

Dedicated EC2 Instance per Pod

  • Each Pod now runs exclusively on its own dedicated EC2 instance (see the verification sketch below).
  • This ensures improved resource isolation, performance consistency, and simplified infrastructure management.
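For illustration, the one-Pod-per-instance mapping can be checked from outside ALD. The following is a minimal sketch, assuming the official kubernetes Python client, a valid kubeconfig, and a hypothetical namespace name; it lists each Pod together with the node (EC2 instance) hosting it.

```python
# Sketch only: assumes the official "kubernetes" Python client and a valid kubeconfig.
# The namespace name is a hypothetical placeholder.
from collections import Counter
from kubernetes import client, config

config.load_kube_config()          # or config.load_incluster_config() inside the cluster
v1 = client.CoreV1Api()

namespace = "my-ai-pack-ns"        # hypothetical namespace
pods = v1.list_namespaced_pod(namespace=namespace)

nodes = [p.spec.node_name for p in pods.items]
for pod, node in zip(pods.items, nodes):
    print(f"{pod.metadata.name} -> {node}")

# With the dedicated-instance guarantee, no node should appear more than once.
duplicates = [n for n, c in Counter(nodes).items() if c > 1]
print("nodes hosting more than one Pod:", duplicates or "none")
```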

Custom Entry Point Configuration

  • You can now define a custom entry point when deploying an AI Pack, allowing for more flexible and tailored deployment setups (see the sketch below).
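A minimal sketch of how a custom entry point might be supplied at deployment time. The endpoint URL and field names below are hypothetical illustrations, not the actual ALD API schema; consult the deployment documentation for the real interface.

```python
# Sketch only: URL and field names are hypothetical, shown to illustrate the idea
# of overriding the default entry point when deploying an AI Pack.
import requests

deploy_request = {
    "ai_pack": "my-llm-solution",            # hypothetical AI Pack identifier
    "entry_point": "python -m serving.app",  # custom entry point instead of the default
}

resp = requests.post("https://ald.example.com/api/v1/deployments", json=deploy_request)
resp.raise_for_status()
print(resp.json())
```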

Chunk-Based Streaming for Registration

  • Registration now supports chunk-based streaming, significantly reducing memory usage during the process.
  • This enhancement improves stability and scalability when handling large solution files (see the streaming sketch below).
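The general technique is sketched below with the requests library: the solution archive is read and sent in fixed-size chunks, so peak memory stays bounded regardless of file size. The registration endpoint and content type shown here are hypothetical.

```python
# Sketch only: the registration URL is hypothetical; the point is the chunked upload.
import requests

CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB per chunk

def read_in_chunks(path, chunk_size=CHUNK_SIZE):
    """Yield the solution archive piece by piece instead of loading it whole."""
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            yield chunk

# Passing a generator makes requests send the body with chunked transfer encoding,
# so memory usage is roughly one chunk rather than the whole file.
resp = requests.post(
    "https://ald.example.com/api/v1/solutions/register",  # hypothetical endpoint
    data=read_in_chunks("my-solution.tar.gz"),
    headers={"Content-Type": "application/octet-stream"},
)
resp.raise_for_status()
```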

Exception Handling & Rollback During Pod Creation

  • Improved error handling has been added to the Pod creation process.
  • If a failure occurs (e.g., due to issues in installation scripts), the system now performs a rollback, ensuring that the AI Pack remains in its last known good state (a generic rollback pattern is sketched below).
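The flow is roughly the classic create-then-roll-back pattern. The sketch below uses placeholder functions for the ALD-internal steps; it is not the actual implementation.

```python
# Sketch only: placeholder functions stand in for ALD's internal provisioning steps.
import logging

logger = logging.getLogger(__name__)

def create_pod(spec):
    # Placeholder for the actual Pod provisioning call.
    return {"name": spec["name"]}

def run_install_scripts(pod):
    # Placeholder for installation scripts, which may raise on failure.
    pass

def delete_pod(pod):
    # Placeholder for cleanup of the partially created Pod.
    pass

def create_pod_with_rollback(spec):
    pod = create_pod(spec)
    try:
        run_install_scripts(pod)
    except Exception:
        logger.exception("Pod creation failed; rolling back")
        delete_pod(pod)  # undo the partial deployment so the AI Pack stays in its last good state
        raise            # surface the original error to the caller
    return pod
```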

Dependencies

  • AI Conductor: v2.2.3


v1.1.0

Jul 21, 2025

Features

Namespace Selection for AI Pack Deployment

  • Users can now choose the target namespace when deploying an AI Pack.
  • AI Packs can be deployed either to their own dedicated namespace or to the shared workspace namespace, providing more flexibility in environment configuration and resource isolation (see the sketch below).
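A minimal sketch of the two namespace choices at deployment time. The endpoint URL and field names are hypothetical, not the actual ALD API schema.

```python
# Sketch only: URL and field names are hypothetical; they illustrate choosing between
# a dedicated namespace and the shared workspace namespace when deploying an AI Pack.
import requests

# Deploy into a namespace dedicated to this AI Pack...
dedicated = {"ai_pack": "my-llm-solution", "namespace": "my-llm-solution-ns"}

# ...or into the shared workspace namespace.
shared = {"ai_pack": "my-llm-solution", "namespace": "shared-workspace"}

resp = requests.post("https://ald.example.com/api/v1/deployments", json=dedicated)
resp.raise_for_status()
```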

Optional Namespace Deletion on AI Pack Undeploy

  • A new option has been added to control whether the namespace should be deleted when undeploying an AI Pack.
  • This is especially useful in environments where ALD does not have permission to delete namespaces directly, allowing administrators to manage namespace cleanup behavior via configuration (see the sketch below).
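A minimal sketch of the undeploy option. The flag name and endpoint are hypothetical; in environments where ALD cannot delete namespaces, the flag would stay disabled and cleanup would happen out of band.

```python
# Sketch only: flag name and endpoint are hypothetical illustrations of the option.
import requests

undeploy_request = {
    "ai_pack": "my-llm-solution",
    "delete_namespace": False,  # keep the namespace; only the AI Pack itself is removed
}

resp = requests.delete(
    "https://ald.example.com/api/v1/deployments/my-llm-solution",  # hypothetical
    json=undeploy_request,
)
resp.raise_for_status()
```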

Dependencies

  • AI Conductor: v2.2.2


v1.0.0

Jun 27, 2025

Features

Introduction of AI Logic Deployer (ALD)

  • Service APIs built with LLM technology can now be registered, deployed, and integrated into the AI Conductor platform as LLM solutions, enabling seamless service implementation and management within the AI Conductor ecosystem.
  • Each LLM solution is packaged with Helm and deployed as an AI Pack inside the Kubernetes Istio service mesh, with full observability through integrated monitoring via the o11y stack.

Authentication

  • Authentication and state management conform to the standards and mechanisms provided by AI Conductor.

End-to-End Solution Lifecycle Management

  • ALD provides the functionality to manage the entire lifecycle of a Service API, from registration and deployment through undeployment.

Based on Official Python 3.12

  • The system is based on Python 3.12, allowing developers to leverage the latest features and performance improvements of the language.

Dependencies

  • AI Conductor: v2.2.0