At Omnilex, we're on a mission to transform the way lawyers work. Our AI-native platform helps legal professionals boost their productivity in legal research and automate workflows. We collaborate closely with our clients and iterate at a market-leading pace. In a year, we have gone from an early MVP to a product used daily by thousands of legal professionals at clients in Switzerland, Germany, and Liechtenstein, and we are now scaling rapidly across Europe.
We stand out by tackling unique challenges, combining external data, customer-internal data, and our own innovative AI-first legal commentaries.
You'll be joining a young, passionate, and dynamic team of 15 with roots at ETH Zurich.
You like the last mile, the part where an AI product stops being a demo and starts surviving real life: inconsistent documents, weird naming conventions, strict access rules, stakeholders who notice every edge case, and workflows that were never designed for "AI assistants."
You're the person who can sit with a legal team, understand what they actually need, translate that into system behavior, and then implement it cleanly. You enjoy being the connective tissue between customers, domain experts, and the core engineering team, shipping practical improvements and leaving behind crisp documentation so the next rollout is smoother.
As a Forward Deployed AI Engineer, your mission is to bring Omnilex into customer environments and make it work exceptionally well, then turn what you learn into reusable product capabilities.
Lead technical onboarding for new customers: ingest documents, build indexes, map metadata (jurisdiction, authority, recency), and run validation checks
Tune retrieval and reranking behavior to match customer expectations (practice area focus, internal taxonomies, document patterns, relevance definitions)
Deliver customer-specific UX and workflow adaptations: templates, default filters, jurisdiction presets, citation formatting, permission-aware retrieval, and customized result views
Adjust prompting and context strategies to meet strict requirements (grounding, traceability, citation style, explanation depth, fallback behavior)
Build and enforce guardrails: provenance tracking, source-grounded generation, "no source, no statement" rules, and risk-aware uncertainty patterns suitable for legal contexts
Create small but high-signal evaluation sets per customer (gold questions, acceptance criteria, "cannot fail" scenarios)
Perform fast failure analysis and ship improvements: chunking changes, deduping, reranker adjustments, query interpretation tweaks, caching, and routing strategies
Keep response times and usage costs sane through batching, caching, early exits, and practical fallback paths
Track quality signals and usage patterns; convert feedback into measurable fixes and clear acceptance tests
Work closely with Customer Success and legal experts to convert pain into engineering work
Write deployment playbooks and integration "recipes" so customer solutions become repeatable patterns over time
Strong practical experience building or adapting search/retrieval systems in production (hybrid retrieval, reranking, indexing, query understanding)
Experience taking LLM features from prototype to stable, real-world usage
Solid TypeScript/Node.js skills (our core stack)
Hands-on experience with at least one of: Azure AI Search, pgvector/PostgreSQL, OpenSearch/Elasticsearch (or comparable systems)
Strong engineering judgment: debugging skills, performance tuning, careful edge-case handling, and operational thinking
Comfortable working directly with customers: deep technical sessions, trade-off explanations, and clear written documentation
Fluent English; available full-time.
Hybrid setup: at least two days per week on-site in Zurich.
German proficiency (many sources and stakeholder conversations are German-speaking)
Experience integrating customer document sources and pipelines (connectors, ETL, access controls)
Experience with lightweight evaluation processes (human labeling loops, basic agreement checks, simple dashboards)
Familiarity with sparse + dense retrieval approaches (BM25 variants included)
Experience running and operating services (Docker a plus)
Familiarity with Azure / NestJS / Next.js
Exposure to Swiss / German / US legal systems
Tangible customer impact: your work directly affects daily trust and adoption inside legal teams
High ownership: you run deployments end-to-end and help define reusable solution patterns
Fast feedback loops: youâll see real failure modes early and influence product direction with evidence
Compensation: CHF 8'000–12'000 per month + ESOP, depending on experience and skills