Land your next role
without the noise.

Scout AI uses intelligent agents to discover, match, and rank job opportunities specifically for your profile. No more endless scrolling.

Dashboard Preview
Job Search Chaos Collage

Searching manually
is the old way.

01

Board Hopper Burnout

Manually scouring boards like LinkedIn and Indeed is a recipe for exhaustion. Scout aggregates every lead into a single, automated pipeline.

02

Hidden Misalignment

Applying with generic templates often leads to silent rejection. We use deep semantic analysis to ensure your profile actually resonates.

03

The Keyword Guessing Game

Stop blindly tweaking your resume for ATS bots. Our agents provide transparent matching scores so you know exactly where you stand.

Designed for Clarity

Supercharge your job search with powerful automation.

📁

Multiple Resumes

Easily match different resume profiles to multiple career paths.

🌐

Multi-Source Scraping

Monitor LinkedIn, Indeed, and more in parallel.

⏲️

Scheduled Pipelines

Set up automated, recurring runs that keep your pipeline full.

📧

Direct Mail Matches

The best opportunities land straight in your inbox as email digests.

Technical Deep Dive

Built with production-style architecture, explained in plain language.

How one pipeline run works

01

Discover Jobs

Scrapers collect listings from LinkedIn, Indeed, Glassdoor, Reddit, and custom URLs. Duplicate links are removed early.

02

Parse + Structure

An LLM converts raw pages into structured fields: title, company, skills, requirements, responsibilities, and benefits.

03

Semantic Matching

Resume sections are embedded as vectors. Each job is compared against all resume chunks to find evidence-level fit.

04

Rank Opportunities

Final score combines match quality, posting freshness, and source trust signals to prioritize realistic opportunities.

05

Generate Outreach + Notify

For jobs with contact channels, concise outreach drafts are generated and sent in a digest email with diagnostics.
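The five stages above can be sketched as a chain of plain Python functions. This is a deliberately simplified illustration, not Scout's actual code: every function name, field, and placeholder value here is hypothetical.

```python
# Illustrative sketch of the five pipeline stages. All names and data
# are hypothetical simplifications, not Scout's real implementation.

def discover(sources):
    # Stage 01: collect listings and drop duplicate URLs early.
    seen, jobs = set(), []
    for url in sources:
        if url not in seen:
            seen.add(url)
            jobs.append({"url": url})
    return jobs

def parse(jobs):
    # Stage 02: the real pipeline uses an LLM to extract structured
    # fields; here we just attach placeholder structure.
    for job in jobs:
        job.update({"title": "?", "skills": []})
    return jobs

def match(jobs, resume_chunks):
    # Stage 03: stand-in for embedding-based similarity scoring.
    for job in jobs:
        job["match"] = 0.5  # placeholder score
    return jobs

def rank(jobs):
    # Stage 04: order by score (a composite in the real pipeline).
    return sorted(jobs, key=lambda j: j["match"], reverse=True)

def notify(jobs):
    # Stage 05: return a digest summary instead of sending email.
    return [j["url"] for j in jobs]

digest = notify(rank(match(parse(discover([
    "https://example.com/job1",
    "https://example.com/job1",  # duplicate, removed at stage 01
    "https://example.com/job2",
])), [])))
print(digest)  # two unique jobs remain
```

The point of the shape is that each stage has one responsibility and hands a plain data structure to the next, which is what makes individual stages easy to retry or swap out.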

Architecture at a glance

  • Orchestration: LangGraph controls stage-to-stage transitions and retry paths.
  • Async execution: Celery runs pipeline jobs in background workers.
  • API layer: FastAPI handles auth, settings, run triggers, cancellation, and job retrieval.
  • Relational data: PostgreSQL stores users, settings, run history, and ranked job records.
  • Vector retrieval: Qdrant stores resume chunk embeddings for semantic evidence matching.
  • Scheduling: APScheduler triggers periodic runs and recovers missed schedules after downtime.

Reliability and quality controls

  • Dedup layers: URL dedupe, seen-job filtering, and post-parse dedupe reduce noisy repeats.
  • Adaptive freshness: If fresh jobs are too few, discovery retries with broader recency windows.
  • LLM failover: Router supports multi-provider fallback, rate limits, token caps, and cached responses.
  • Scoring transparency: Match scores come from chunk-level evidence, section weighting, and skill overlap.
  • Safety: Resume context is sanitized before generation to reduce prompt-injection-style patterns.
  • Operational tracking: Each run stores status, timings, counts, and errors for review.
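As a concrete example of a dedup layer, URL-level dedup usually means normalizing away tracking noise before checking a "seen" set. The sketch below is a hypothetical illustration using only the standard library; Scout's actual dedup rules may differ.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical URL-dedup layer: normalize, then filter via a seen-set.

def normalize(url: str) -> str:
    # Drop query strings and fragments so tracking parameters don't
    # make the same posting look like two different jobs.
    scheme, netloc, path, _, _ = urlsplit(url)
    return urlunsplit((scheme, netloc.lower(), path.rstrip("/"), "", ""))

def dedupe(urls, seen=None):
    seen = set() if seen is None else seen
    fresh = []
    for url in urls:
        key = normalize(url)
        if key not in seen:
            seen.add(key)
            fresh.append(url)
    return fresh

urls = [
    "https://boards.example.com/jobs/123?utm_source=feed",
    "https://boards.example.com/jobs/123",
    "https://boards.example.com/jobs/456",
]
print(dedupe(urls))  # the tracked and plain forms of job 123 collapse to one
```

Passing a shared `seen` set across runs is what turns this from per-run dedup into the "seen-job filtering" layer described above.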

Quick Glossary

Agentic Pipeline

A workflow where each stage has one clear responsibility and passes results to the next stage.

Embedding

A numeric representation of text that enables semantic comparison beyond exact keyword matching.
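In practice, "semantic comparison" between embeddings usually means cosine similarity. Here is a stdlib-only sketch with made-up three-dimensional vectors; real embeddings come from a model and have hundreds of dimensions.

```python
import math

# Toy illustration of comparing embeddings with cosine similarity.
# The vectors and their "meanings" below are invented for the example.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

resume_chunk = [0.9, 0.1, 0.3]   # e.g. "built REST APIs in Python"
job_skill    = [0.8, 0.2, 0.4]   # e.g. "backend Python developer"
unrelated    = [0.1, 0.9, 0.0]   # e.g. "forklift operator"

print(cosine_similarity(resume_chunk, job_skill))   # close to 1.0
print(cosine_similarity(resume_chunk, unrelated))   # much lower
```

Because the comparison is geometric rather than lexical, "built REST APIs" can score highly against "backend developer" even though they share no keywords.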

Vector Search

Fast retrieval of the most semantically similar resume chunks for a given job description.

Circuit Breaker

A protection pattern that temporarily stops calling an unstable provider until cooldown ends.
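A minimal sketch of the pattern (hypothetical, not Scout's actual router code): after a run of consecutive failures the breaker "opens" and calls are skipped until the cooldown elapses.

```python
import time

# Minimal circuit-breaker sketch. Thresholds and timing are
# illustrative defaults, not values from Scout.

class CircuitBreaker:
    def __init__(self, max_failures=3, cooldown=30.0):
        self.max_failures = max_failures
        self.cooldown = cooldown
        self.failures = 0
        self.opened_at = None

    def allow(self) -> bool:
        if self.opened_at is None:
            return True
        if time.monotonic() - self.opened_at >= self.cooldown:
            # Cooldown over: close the breaker and allow a retry.
            self.opened_at = None
            self.failures = 0
            return True
        return False

    def record_failure(self):
        self.failures += 1
        if self.failures >= self.max_failures:
            self.opened_at = time.monotonic()

    def record_success(self):
        self.failures = 0

breaker = CircuitBreaker(max_failures=2, cooldown=60.0)
breaker.record_failure()
breaker.record_failure()   # second failure opens the breaker
print(breaker.allow())     # False: the provider is skipped for now
```

In a multi-provider router, one breaker per provider lets the system fail over to a healthy provider instead of hammering a broken one.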

Background Worker

A separate process that runs long tasks asynchronously so the web app remains responsive.

Composite Ranking

A final score made from multiple factors so one weak signal does not dominate job ordering.
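A composite score is just a weighted blend of normalized factors. The weights and factor names below are illustrative, not Scout's real formula.

```python
# Hypothetical composite-ranking sketch: match quality, posting
# freshness, and source trust blended with made-up weights.

WEIGHTS = {"match": 0.6, "freshness": 0.25, "trust": 0.15}

def composite_score(job: dict) -> float:
    # Each factor is expected in [0, 1]; the blend keeps any single
    # weak signal from dominating the ordering.
    return sum(job[factor] * weight for factor, weight in WEIGHTS.items())

jobs = [
    {"title": "Backend Engineer", "match": 0.92, "freshness": 0.40, "trust": 0.80},
    {"title": "Data Analyst",     "match": 0.70, "freshness": 0.95, "trust": 0.90},
]
ranked = sorted(jobs, key=composite_score, reverse=True)
print([j["title"] for j in ranked])
```

Note how the fresher, better-trusted listing can outrank the one with the higher raw match score, which is exactly the "no one weak signal dominates" property the definition describes.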

Ready in minutes

Get started with Docker or local setup.

GitHub: Ha4sh-447/Scout-ai
# Clone the repository
git clone https://github.com/Ha4sh-447/Scout-ai.git
cd Scout-ai

# Launch setup script (recommended: Docker)
python setup.py

# Or see docs for local setup without Docker