Junior Full-Stack Engineer - LLM-Powered Products

Valsoft Corporation
Full-time
On-site

Established in Canada in 2015, Valsoft has grown to a global portfolio of 118+ companies, acquiring and developing vertical market software businesses and enabling each one to deliver the best mission-critical solutions for customers in its industry. A key tenet of Valsoft’s philosophy is to invest in well-established businesses and foster an entrepreneurial environment that molds them into industry leaders. Valsoft looks to buy, hold, and create value through long-term partnerships with existing management.

Requirements

What You’ll Do

  • Build user-facing features in React/Next.js and hook them to back-end APIs you write in TypeScript (Node) or Python (FastAPI).
  • Craft and version prompts; experiment with tools like LangChain to chain calls and log outputs for review.
  • Use AI pair-programming tools (Cursor, Windsurf, Copilot, etc.) to speed up coding, refactors, and test generation.
  • Spin up deployments via Vercel / Fly.io or GitHub Actions, debug when things inevitably break, and celebrate the green check-mark.
  • Join product brainstorms—your side-project instincts are welcome.
  • Keep an eye on basic metrics (load time, error rate) and squash bugs before users tweet them.

Why You’ll Love It

  • Ship real stuff fast – small team, weekly releases, instant user feedback.
  • Play with cutting-edge toys – OpenAI, Anthropic, or custom LLM endpoints, vector stores, and AI code editors (Cursor, Windsurf, Copilot).
  • Mentorship that matters – direct pairing with a senior lead; you’ll see how product decisions are made, not just tickets.
  • Clear growth path – a successful product launch puts you on a clear track to mid-level within ~18 months.

Must-Haves

  • Solid grasp of JavaScript/TypeScript and React plus comfort in either Node or Python back-end work.
  • Comfort with AI code editors—show us a commit, video, or repo where you actually used one.
  • Basic Git workflows: branch → PR → merge, and you know why CI turns the build red.
  • Familiarity with any deploy flow (Vercel button, Fly.io, Heroku, GitHub Actions, etc.).
  • Curiosity for how products delight users, not just how code compiles.
  • Proven fluency with REST/GraphQL APIs—comfortable crafting and parsing JSON payloads, debugging requests with Postman/Insomnia or browser dev-tools, and handling error codes/retries.
  • Hands-on experience iterating prompts in tools like ChatGPT, Cursor, or Windsurf. You can show how prompt tweaks changed an LLM’s output to match a specific product need.

Nice-to-Haves

  • LangChain, LlamaIndex, or any RAG experiment on your GitHub.
  • Vector database dabbling (pgvector, Pinecone).
  • A side project people outside your family have used.

Our Stack

React 18 • Next.js • Node 20 • FastAPI • OpenAI & Anthropic APIs • Postgres + pgvector • Vercel • Fly.io • GitHub Actions • Cursor & Windsurf for daily coding

The closing date for this role is 18th August 2025.