Talent Acquisition + AI

Joshua Lollman

Making AI work for recruiting.

Focused on prompts, workflows, and practical tooling.

All views are my own. Examples are generalized or anonymized and do not reflect any single employer's confidential data, systems, or metrics.

Joshua Lollman

I work at the point where recruiting, workflow design, and AI overlap. I've led teams, carried reqs myself, and spent the last few years building the systems and practices that help recruiters and hiring managers work without friction. Most of what I do now focuses on how AI fits into real day-to-day recruiting work—what it helps with, what it doesn't, and how to make it usable for people who already have a full plate.

This site collects the tools, references, and explanations I use with teams so the work stays practical and grounded.


Current focus and prior roles

Current (2024-Present)

My focus is AI enablement and talent acquisition operations in large-scale recruiting environments. I work with recruiting teams on AI adoption, workflow design, and the processes that keep hiring moving.

That work includes the kinds of recruiting tech initiatives common in large organizations (career sites, CMS, sourcing tools), emphasizing clear requirements, usability, and responsible adoption. I also develop prompt frameworks and evaluation approaches for screening use cases, using generalized examples to test output quality and consistency.

The governance piece covers AI audit readiness and responsible-use practices, with an emphasis on clear documentation and reviewability.

Before (2021-2024)

I managed a recruiting team focused on high-volume frontline hiring. Over that period we made sustained progress, improving fill rates and meaningfully reducing early attrition through systematic changes to sourcing, screening, and new-hire support.

Along the way, I built trackers, audit checklists, knowledge libraries, and process documentation—the kinds of operational backbone that help recruiting teams scale. I also contributed to process improvement efforts focused on testing and refining changes in recruiting workflows.

Earlier (2018-2021)

Before moving into leadership, I worked as a full-cycle recruiter covering everything from high-volume customer service roles through director-level searches. I was brought in to transition work from external support, build steady relationships with hiring managers, and expand coverage as the approach proved out.

I originally came from staffing—technical recruiting under MSP agreements, business development, and client management—where delivery commitments and SLAs weren't negotiable. That's where I learned how recruiting actually feels when you're responsible for both the numbers and the relationships.

Where domain expertise meets AI fluency

AI Strategy & Enablement

Workshops, coaching, documentation. Teaching recruiters and leaders how to evaluate AI tools, use them effectively, and understand the risks. The hard part isn't the technology—it's getting people to engage with it honestly.

Prompt Engineering

Prompts for outreach, JDs, interview guides, screening. The hard part isn't the syntax—it's knowing what good output looks like in recruiting, then working backward from there.
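
As one example of working backward from the output: a minimal sketch of a screening-guide prompt where the quality bar is written into the prompt itself. The role, criteria, and wording below are hypothetical placeholders, not anything pulled from a real req.

```python
# Illustrative only: a screening-guide prompt built backward from output criteria.
# The role, must-haves, and requirements are hypothetical examples.

SCREEN_GUIDE_PROMPT = """You are helping a recruiter prepare a 20-minute phone screen.

Role summary:
{role_summary}

Must-have qualifications:
{must_haves}

Write 6-8 screening questions. Requirements for good output:
- Each question maps to exactly one must-have qualification.
- No questions that can be answered yes or no.
- No questions about protected characteristics or salary history.
- For each question, add one sentence on what a strong answer sounds like.
"""

def build_screen_prompt(role_summary: str, must_haves: list[str]) -> str:
    """Fill the template; the recruiter still reviews and edits the result."""
    return SCREEN_GUIDE_PROMPT.format(
        role_summary=role_summary.strip(),
        must_haves="\n".join(f"- {m}" for m in must_haves),
    )

if __name__ == "__main__":
    print(build_screen_prompt(
        "Customer support team lead for a high-volume contact center.",
        ["2+ years leading frontline teams", "Experience with workforce scheduling"],
    ))
```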

Workflow Design

Map recruiting processes before automating them. Understanding what’s manual, what’s cognitive load, and what’s actually broken helps determine where AI fits and where it doesn’t.

Automation & Tooling

Data cleanup, content generation, workflow utilities. I build these using AI-assisted development—not a developer, but I know the problem well enough to spec it precisely and test until it works.
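
To make "workflow utilities" concrete, here is a small, hypothetical cleanup script of the kind I mean: it normalizes free-text job titles so reports group them consistently. The abbreviation map and sample titles are made-up examples, not a real data set.

```python
# Illustrative only: normalize free-text job titles so reporting groups them
# consistently. The abbreviation map and sample titles are hypothetical.

ABBREVIATIONS = {"sr": "Senior", "jr": "Junior", "mgr": "Manager", "rep": "Representative"}

def normalize_title(raw: str) -> str:
    """Trim whitespace, expand common abbreviations, and title-case the result."""
    words = [w.strip(".,") for w in raw.strip().lower().split()]
    expanded = [ABBREVIATIONS.get(w, w) for w in words if w]
    return " ".join(w.capitalize() for w in expanded)

if __name__ == "__main__":
    for title in ["  sr. customer service rep ", "MGR, talent acquisition"]:
        print(normalize_title(title))
```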

AI Tool Partnership

Working with engineering teams on recruitment AI. I provide the recruiter perspective—pain points, test scenarios, output validation. They build it; I make sure it solves the right problem.

AI Compliance Awareness

Tracking US hiring AI legislation—CA, IL, NY, and what's emerging elsewhere. Not a lawyer, but someone needs to stay current and help the team understand where the risks are.

Work examples in practice

Real problems solved at scale. These aren't sales deck case studies—they're representative, anonymized examples of the kinds of systems and frameworks I design.

AI Adoption Framework

The problem: Organizations introduce AI tools but see low usage. Recruiters don't know when or how to use them, and stakeholders struggle to assess impact or identify where support is needed.

What I built: A representative adoption framework: a set of workflow-specific tools (JD generation, sourcing strategy, screening guides, data analysis), structured training, usage signals, and a way to measure adoption. Designed for large, distributed recruiting teams.

Approach: Start with recruiting pain points, not cool technology. Build tools for specific workflows, train on realistic scenarios, and track adoption qualitatively. Design for local context—language, time zones, market differences—without hardcoding a single organization's process.

Outcome: The framework is intended to drive consistent usage across distributed teams. Impact can be assessed through workflow feedback and reductions in rework, with human review kept in place. Designed to be adaptable to other HR functions.

What made it work: Treat it as change management, not a feature release. The technology is the easy part.

Compliance Audit Automation

The problem: Manual job description compliance reviews often can't keep up with posting volume. Risky language slips through, visibility is limited, and cleanup becomes reactive.

What I built: A lightweight, illustrative audit prototype using scripting, rule-based checks, and simple dashboards. It shows how structured records could be scanned for potential issues, grouped by category, and surfaced with correction guidance.

Approach: Designed for fast build and easy handoff—documented setup, minimal technical overhead. Rules are separated from code so non-technical stakeholders can update compliance requirements.
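
To make "rules separated from code" concrete, here is a minimal, hypothetical sketch: the flag patterns live in a plain JSON file that a compliance owner can edit, and the script only loads and applies them. None of the patterns, categories, or guidance shown come from a real rule set.

```python
# Illustrative prototype: rule-based scan of job description text.
# Rules live in a separate JSON file so compliance owners can update them
# without touching code. All patterns below are hypothetical examples.

import json
import re

# rules.json might look like:
# [
#   {"id": "age-language", "category": "age", "pattern": "\\b(young|digital native)\\b",
#    "guidance": "Describe the skill, not the age group."},
#   {"id": "salary-history", "category": "pay", "pattern": "salary history",
#    "guidance": "Remove salary-history requests; several states prohibit them."}
# ]

def load_rules(path: str) -> list[dict]:
    """Load the editable rule set from a JSON file."""
    with open(path) as f:
        return json.load(f)

def scan_text(text: str, rules: list[dict]) -> list[dict]:
    """Return one finding per matching rule, ready to group by category."""
    findings = []
    for rule in rules:
        if re.search(rule["pattern"], text, flags=re.IGNORECASE):
            findings.append({
                "rule_id": rule["id"],
                "category": rule["category"],
                "guidance": rule["guidance"],
            })
    return findings

if __name__ == "__main__":
    rules = load_rules("rules.json")
    jd_text = "Looking for a young, energetic digital native. Please include salary history."
    for finding in scan_text(jd_text, rules):
        print(f"[{finding['category']}] {finding['rule_id']}: {finding['guidance']}")
```

Keeping the rules in data rather than code is the part that makes handoff realistic: the person who owns the compliance requirement can change it without a developer in the loop.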

Outcome: Intended to reduce manual spot-checking and improve compliance visibility. Emphasizes auditable documentation and encourages better input quality for downstream tools.

Lesson learned: Sometimes the best solution isn't elegant—it's the one you can deliver quickly and maintain without a developer.

How I work

I'm not a traditional developer, but I break problems down clearly and use AI-assisted or low-code tools to build what teams need.

Complex projects go through engineering. Smaller tools I build myself using AI-assisted workflows. Either way, the value is the same: recruiting problems solved by someone who's worked them firsthand.

Let's talk

Open to connecting with others in the space.