Hazrat Ali
AI products & automation · 2026 · live

hazratali.me — this portfolio, AI-reframed

Yes — the website you're reading right now. A portfolio with a Claude-powered hero reframer and a floating Ask-me chat. The self-reference is the point.

Role
Design, frontend, AI integration, deployment — all of it
Year
2026

Problem

Portfolio sites in 2026 are mostly static grids. A senior developer needs a first impression that proves craft on the spot — not a case study that says "trust me, I did an AI build once." The page itself should be the evidence.

Approach

Here's the fun part: this case study lives on the site it describes. Two signature AI moments are baked into the page itself:

  1. Hero reframer. Type what you're building into the hero input and Claude rewrites the subhead to fit. Streams token by token from a Next.js route handler, with a typewriter buffer that smooths proxy bursts so the reveal feels like handwriting, not chunks arriving over the wire.
  2. Ask me anything. The floating button in the corner. A Claude chat grounded in my profile — rates, timelines, availability, how I build. Visitors who skip the hero still get real answers. Inline contact handoff if they want a human reply.
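The typewriter buffer mentioned above can be sketched roughly like this — a sketch only, not the site's actual code, and the `TypewriterBuffer` name is hypothetical. Incoming network chunks land in a queue, and a fixed-rate drain releases a few characters per tick, so the reveal stays smooth even when the proxy delivers tokens in bursts:

```typescript
// Minimal typewriter buffer: streamed chunks go into a queue; a
// timer-driven drain releases up to n characters per tick so the
// on-screen reveal is steady regardless of how chunks arrive.
class TypewriterBuffer {
  private queue = "";
  private out = "";

  // Called whenever a streamed chunk arrives from the server.
  push(chunk: string): void {
    this.queue += chunk;
  }

  // Called on a timer (e.g. every 16 ms); returns the full visible text.
  drain(n: number): string {
    const take = this.queue.slice(0, n);
    this.queue = this.queue.slice(n);
    this.out += take;
    return this.out;
  }

  get done(): boolean {
    return this.queue.length === 0;
  }
}
```

In a React page this would typically drive a state setter from `setInterval` or `requestAnimationFrame`, with `push` fed by the SSE reader loop.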

Everything else is editorial restraint. Ember palette, Instrument Serif display, mono labels. Reduced-motion friendly. Per-line headline rise, scroll-revealed sections, warm radial ambience — enough motion to feel alive, not so much it looks like a SaaS landing page.

Technical deep-dive

  • claude-haiku-4-5 via the Anthropic SDK for both the reframer and the chat. Cheap per token, fast, good enough — the hero is a one-sentence rewrite, the chat is profile Q&A, neither needs Sonnet.
  • Streaming over SSE through Next.js App Router route handlers. AbortController on the client so navigation cancels in-flight streams.
  • IP-based rate limits on both endpoints. System prompts and knowledge live in files in the repo, not in a database — every model response is traceable to a git revision.
  • Contact handoff goes through Resend. The chat pre-fills the visitor's last question as reply context so I'm never guessing what they asked.
  • Velite pipeline for MDX case studies (including this one). Dynamic OG images per case. Person + Service JSON-LD on the homepage.
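The IP-based rate limiting above can be illustrated with a small in-memory sliding-window limiter — a sketch under assumptions, not the deployed implementation, and `IpRateLimiter` is a hypothetical name. Each IP gets a fixed number of requests per window; timestamps outside the window expire:

```typescript
// In-memory sliding-window rate limiter keyed by client IP.
// Allows `limit` requests per `windowMs` per IP; older hits expire
// as the window slides forward.
class IpRateLimiter {
  private hits = new Map<string, number[]>(); // ip -> hit timestamps (ms)

  constructor(
    private limit: number,
    private windowMs: number,
  ) {}

  // Returns true if this request is allowed, false if rate-limited.
  allow(ip: string, now: number = Date.now()): boolean {
    const cutoff = now - this.windowMs;
    const recent = (this.hits.get(ip) ?? []).filter((t) => t > cutoff);
    if (recent.length >= this.limit) {
      this.hits.set(ip, recent);
      return false;
    }
    recent.push(now);
    this.hits.set(ip, recent);
    return true;
  }
}
```

In a route handler you would read the client IP (e.g. from the `x-forwarded-for` header behind a proxy) and return a 429 response whenever `allow` is false. A single-process in-memory map like this suits a self-hosted single-instance deployment; a multi-instance setup would need shared state instead.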

Outcome

Shipped it, hosted it myself, and made the AI features the filter — not the garnish. If you've read this far, the reframer and the chat have already done their job.

Stack & handoff

Next.js 16 + React 19 + Tailwind v4. Self-hosted on Hetzner via Coolify — the same stack I sell. AI through the Claude API. Source is private but the patterns are in every case study on this site.

Stack

Next.js 16 · React 19 · TypeScript · Tailwind v4 · motion · Anthropic SDK · Velite · Resend