The HXA079 family of hybrid models combines RWKV recurrent architectures with Transformer-based attention, and is designed for efficient long-context processing.
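As a rough illustration of the hybrid idea (not the actual HXA079 architecture, whose layer layout and kernels are not described here), the sketch below interleaves a simplified RWKV-style recurrent time-mix, which runs in O(T) with a fixed-size state, with standard O(T^2) causal softmax attention. The decay constant, layer count, and the `attn_every` interleaving ratio are all hypothetical placeholders.

```python
import numpy as np

def rwkv_style_mix(x, decay=0.9):
    # Simplified recurrent time-mix: an exponentially decayed running state,
    # O(T) in sequence length. A stand-in for RWKV's WKV recurrence, which
    # additionally uses learned per-channel decays and receptance gating.
    out = np.empty_like(x)
    state = np.zeros(x.shape[-1])
    for t in range(x.shape[0]):
        state = decay * state + (1.0 - decay) * x[t]
        out[t] = state
    return out

def softmax_attention(x):
    # Plain causal softmax self-attention with identity projections:
    # O(T^2), but content-addressable over the whole context.
    T, d = x.shape
    scores = (x @ x.T) / np.sqrt(d)
    scores[np.triu(np.ones((T, T), dtype=bool), k=1)] = -np.inf  # causal mask
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

def hybrid_forward(x, n_layers=4, attn_every=2):
    # Interleave cheap recurrent mixing layers with occasional full-attention
    # layers (the ratio here is a made-up example, not HXA079's actual one).
    for i in range(n_layers):
        if (i + 1) % attn_every == 0:
            x = x + softmax_attention(x)
        else:
            x = x + rwkv_style_mix(x)
    return x
```

The intended trade-off is that recurrent layers keep per-token cost and memory constant as context grows, while the sparse attention layers retain exact retrieval over long contexts.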