Pillar Labs serves as the internal R&D and innovation engine behind some of the most technically advanced human data annotation in the world. We sit at the intersection of human expertise and artificial intelligence, developing cutting-edge systems, methodologies, and tools that push the boundaries of agentic AI training and evaluation data. Our mission is to enable models to think, reason, and act with greater autonomy, accuracy, and depth of understanding.
Project Specialist - Generalist Annotation
Location: United States, United Kingdom, Germany, France, Estonia, Portugal, Hungary, Poland, Ukraine, Romania, Bulgaria, Czech Republic, Slovakia, Belarus, Moldova, Sweden, Greece, Belgium, Italy, Ireland, Switzerland, Netherlands, Finland, Malta, Denmark, Lithuania, Croatia, Spain, Austria, Bosnia and Herzegovina, Iceland, Luxembourg, North Macedonia, Montenegro, Norway, Serbia, Slovenia, Albania, Cyprus, Latvia, Monaco, Canada, United Arab Emirates
Posted: 6 days ago
Salary: Not specified
Seniority: Mid Level
Job Description
Role Description
We’re looking for a Project Specialist, a project expert and reviewer of reviewers, to help uphold and continuously improve the quality of our annotation projects. In this role, you’ll act as a subject matter expert across multiple initiatives, balancing hands-on quality review with process optimization and annotator support. You’ll collaborate closely with Project Managers and the core team to refine standards, review reviewer performance, and ensure our annotation work consistently meets the highest bar for clarity, consistency, and precision.
You’ll play a key role in maintaining a healthy and well-supported annotator community by triaging issues, answering questions, and enforcing communication and conduct guidelines across our internal platforms.
Primary Responsibilities
Project Expertise & Quality Management (75–80%)
- Serve as a subject matter expert for assigned projects, maintaining deep familiarity with annotation guidelines, tools, and workflows.
- Collaborate with the core team to refine and clarify annotation standards and processes.
- Review annotations and evaluate reviewer performance, providing structured, actionable feedback.
- Identify common errors, inconsistencies, or points of confusion — and recommend or implement process improvements.
- Support updates to annotation instructions, training materials, and onboarding guides to reflect evolving project needs and lessons learned.
- Monitor project metrics and contribute to internal quality discussions or calibration sessions.
Annotator-Facing Issue Management, Community Support & Moderation (20–25%)
- Triage and respond to questions, reports, and discussions in internal forums or communication channels.
- Escalate complex or high-impact issues to the appropriate project or operations leads.
- Enforce community and communication guidelines, including moderating discussions and addressing conduct violations when necessary.
- Collect and synthesize annotator feedback to identify recurring themes, improvement opportunities, or emerging risks.
- Collaborate with Project Managers to ensure feedback loops are closed and community needs are addressed promptly and transparently.
Qualifications
- Have 2+ years of experience in annotation, review, copy editing, or content moderation.
- Have 1+ years working in a senior annotator or reviewer role, or have 1+ years of experience operationally supporting projects.
- Have overlap with UK/European business hours and can work an alternate schedule that includes one weekend day.
- Can consistently catch 90%+ of common errors and inconsistencies in annotated data.
- Thrive on process improvement, spotting patterns, and suggesting changes that save time or reduce errors.
- Are comfortable working in fast-changing project environments with multiple stakeholders. Previous start-up experience preferred.
- Have excellent written communication skills; able to provide clear, respectful, and actionable feedback.
- Are an empathetic communicator who values community, well-being, and fairness.