The Imaginator: Magnum-72B-Career-Strategist
- Developed by: jeff-calderon
- Finetuned from model: unsloth/Qwen2.5-72B-Instruct-bnb-4bit
- Fine-tuning Framework: Unsloth / QLoRA
License & Usage Warning
License: Tongyi Qianwen License Agreement (Research Only / Non-Commercial)
This model is derived from Qwen-72B. Users must comply with the original Alibaba Cloud Tongyi Qianwen License Agreement. This model is intended for research and educational purposes only.
The Vision
We set out to build **"The Imaginator"**: not just a generic resume writer, but a high-level Career Strategist.
Most AI resume tools simply fix grammar. The Imaginator is designed to perform cognitive reasoning: it takes "lazy notes" or an "outdated resume," analyzes a specific target job (e.g., Java Developer pivoting to DevOps), and strategically reframes the candidate's experience to bridge skill gaps without fabricating history.
The "Trinity" Dataset Strategy
To achieve this level of reasoning, we rejected standard freelance datasets (which often sound like sales pitches). Instead, we engineered a custom "Trinity Dataset" of 7,417 high-quality records via a local data factory on an RTX 4080:
1. Type A: The Stylist (Tone & Impact)
- Goal: Master professional, metric-driven business English.
- Input: Weak, passive bullet points.
- Output: Powerful "STAR" method achievements (Situation, Task, Action, Result).
- Source: Mined 2,000 real resume bullets and utilized Grok to inject industry-standard metrics.
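A Type A record pairs a weak bullet with its STAR-style rewrite. A minimal sketch of what one record could look like, assuming a standard instruction-tuning layout (the field names and example values here are illustrative, not the actual dataset schema):

```python
# Hypothetical Type A ("Stylist") training record.
# The instruction/input/output field names are an assumed schema,
# and the metrics in the output are invented for illustration.
type_a_record = {
    "instruction": "Rewrite this bullet point using the STAR method with concrete metrics.",
    "input": "was responsible for handling customer tickets",
    "output": (
        "Resolved 50+ customer support tickets per week, maintaining a 95% "
        "satisfaction score and reducing average response time by 30%."
    ),
}
```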
2. Type B: The Strategist (Logic & Pivoting)
- Goal: Strategic Reframing.
- Input: A complex JSON payload containing Candidate Context + Target Job + Identified Skill Gaps.
- Output: A rewritten experience section that "bridges the gap" using transferable skills.
- Method: Simulated 1,000 career pivot scenarios (e.g., Frontend Dev → Full Stack) using Perplexity/Grok to ensure market accuracy.
- Safety: Rigorously filtered to ensure the model never invents fake job titles or promotions.
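The Type B input is a structured JSON payload. A sketch of what that payload could contain, assuming keys for the three components the card names (candidate context, target job, skill gaps); the exact schema and example values are assumptions:

```python
import json

# Hypothetical Type B ("Strategist") payload. The model card only states
# that the input holds candidate context, target job, and skill gaps;
# these key names and values are illustrative.
payload = {
    "candidate_context": {
        "current_role": "Java Developer",
        "experience": [
            "Built REST APIs with Spring Boot",
            "Maintained CI pipelines in Jenkins",
        ],
    },
    "target_job": "DevOps Engineer",
    "skill_gaps": ["Kubernetes", "Terraform"],
}

# Serialize the payload into the prompt fed to the model.
prompt = json.dumps(payload, indent=2)
```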
3. Type C: The Creator (Synthesis from Chaos)
- Goal: Structuring unstructured data.
- Input: "Lazy" user brain dumps (lowercase, no formatting, typos).
- Output: Fully formatted, perfectly structured resume sections.
- Source: We used a "Ruiner Script" on 3,000 high-quality resumes to reverse-engineer them into lazy text messages, teaching the model how to reconstruct them.
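The "Ruiner Script" idea can be sketched in a few lines: lowercase the text, strip formatting, and inject random typos so the polished resume becomes a "lazy" text message. This is an illustrative reconstruction under those assumptions, not the original script:

```python
import random
import re

def ruin(text: str, typo_rate: float = 0.03, seed: int = 0) -> str:
    """Degrade a polished resume bullet into a 'lazy' text message.

    Sketch of the 'Ruiner Script' concept: lowercase, strip bullet
    markers and extra whitespace, then swap in random typos.
    """
    rng = random.Random(seed)  # fixed seed keeps the output reproducible
    text = text.lower()
    text = re.sub(r"[•\-\*]", "", text)       # drop bullet/formatting markers
    text = re.sub(r"\s+", " ", text).strip()  # collapse whitespace
    chars = list(text)
    for i, c in enumerate(chars):
        if c.isalpha() and rng.random() < typo_rate:
            chars[i] = rng.choice("abcdefghijklmnopqrstuvwxyz")  # inject a typo
    return "".join(chars)

print(ruin("• Led a team of 5 engineers; delivered the project 2 weeks early."))
```

Running the ruined output back through the model as training input teaches it to invert the degradation.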
Capabilities & Performance
This model was fine-tuned on an A100 GPU using Unsloth. It excels at:
- Format Synthesis: Turning raw text into polished documents.
- Strategic Pivoting: Rewriting experience to target specific roles.
- Hallucination Control: Trained specifically not to invent fake job titles to fill gaps.
Inference Example
Input (Lazy User):
"i worked at amazon as a warehouse guy... hit rates... trained new people"
Imaginator Output:
Logistics Associate | Amazon
- Packed products in a timely manner and consistently met or exceeded productivity rates.
- Trained and mentored new employees on safety protocols and packing procedures, improving team efficiency.
How to Use (Unsloth)
```python
from unsloth import FastLanguageModel
import torch

max_seq_length = 8192
dtype = None          # None = auto-detect (bfloat16 on Ampere+ GPUs)
load_in_4bit = True   # 4-bit quantization to fit the 72B model in memory

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "jeff-calderon/Magnum-72B-Imaginator-LoRA",
    max_seq_length = max_seq_length,
    dtype = dtype,
    load_in_4bit = load_in_4bit,
)
FastLanguageModel.for_inference(model)  # enable optimized inference mode

messages = [
    {"role": "system", "content": "You are a professional resume writer. Convert the user's raw notes into a polished Experience section."},
    {"role": "user", "content": "managed a team of 5 sales guys. we hit 1m in revenue."},
]
inputs = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt").to("cuda")
outputs = model.generate(input_ids=inputs, max_new_tokens=512, use_cache=True, do_sample=True, temperature=0.3)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```