Academic Credit Support Request – University Student Building a Korean Public-Service AI Platform with K-EXAONE
Dear K-EXAONE team,
First of all, thank you for making K-EXAONE available to the community. Your work on building a world-class Korean-native LLM is truly inspiring, and it has been a privilege to build on top of it.
I am a computer science undergraduate student in South Korea. I am currently developing KOSMOS, an open-source conversational AI platform that aims to turn Korea's 5,000+ fragmented government APIs (data.go.kr) into a single natural-language interface for citizens.
K-EXAONE (236B-A23B) serves as the core LLM for this platform, accessed through the FriendliAI Serverless API. I am reaching out to humbly ask whether there might be any form of academic credit or compute support available for student researchers using K-EXAONE in public-interest projects. I completely understand if this is not something that can be accommodated, and I appreciate you even taking the time to read this.
Why K-EXAONE
I would like to share why K-EXAONE was the natural and only choice for KOSMOS:
- Native Korean language capability – Government APIs return Korean-only responses with inconsistent schemas across 5,000+ endpoints. K-EXAONE's exceptional Korean proficiency is essential for accurate intent analysis and citizen-facing answers; in my testing, no other accessible model has matched this level of Korean understanding.
- MoE cost efficiency – The 236B-total / 23B-active parameter architecture delivers strong reasoning at a fraction of dense-model inference cost. For a self-funded student project calling LLMs in a multi-turn tool loop, this efficiency is what makes the project feasible at all.
- Tool-use competency – The platform runs an async tool loop in which the LLM must reliably select, parameterize, and chain public API calls across government ministries. K-EXAONE has performed remarkably well in this demanding scenario.
- Alignment with Korean public AI goals – Korea's AI Action Plan (2026–2028) emphasizes public-sector AI transformation. Building this platform on a Korean-made LLM feels right – KOSMOS puts principles 8 and 9 of that national plan into practice.
About the Project
| Item | Details |
| --- | --- |
| Repository | github.com/umyunsang/KOSMOS (public, Apache-2.0) |
| Stack | Python 3.12+ / httpx / Pydantic v2 / pytest |
| Architecture | 6-layer design: Query Engine, Tool System, Permission Pipeline, Agent Swarms, Context Assembly, Error Recovery |
| Target scenarios | Route safety (KOROAD + KMA), emergency care (119 + HIRA), benefit applications (MOHW + Gov24), multi-ministry coordination |
| Conference target | KSC 2026 (Korea Software Congress) paper submission |
The platform is under active development with a spec-driven workflow, and all source code is publicly available for review.
A Humble Request
As a university student, I am currently self-funding all API costs. FriendliAI Serverless credits for K-EXAONE inference are the primary expense, and as the project grows, sustaining this is becoming a challenge. If any of the following were possible, I would be truly grateful:
- Academic credit allocation for continued development and experimentation
- Guidance on any existing student or researcher support programs I may not be aware of
- A referral to the appropriate contact if this is not the right channel
In return, I would be more than happy to provide a detailed technical report on K-EXAONE's performance in the Korean public-service domain, share benchmark results, or properly acknowledge LG AI Research's contribution in any resulting publications.
What This Project Could Offer Back
I hope this project can give a little back to the K-EXAONE ecosystem by serving as real-world validation in a meaningful domain:
- Multi-turn Korean tool-use across heterogeneous government APIs
- Permission-gated PII handling under Korea's PIPA (Personal Information Protection Act)
- Cost-optimized prompt caching in long citizen sessions (128K context window)
- MoE routing behavior analysis under real Korean public-service workloads
All findings will be shared openly with the community.
Thank you so much for your time, and once again, thank you for building K-EXAONE. It is genuinely enabling work that would not be possible otherwise.
With gratitude,
Um Yunsang
Computer Science, Undergraduate
GitHub: github.com/umyunsang