I’m helping Century 21 Accent Homes find a top candidate to join their team full-time for the role of Kafka + Data Pipeline Engineer (Remote / Contract). This is a cool job because you’ll get to build advanced messaging systems with Kafka and AI tech for a legacy real estate leader in business since 1973.

Compensation: USD 2.8K - 4K/month.
Location: Remote (anywhere).

Mission of Century 21 Accent Homes: “We are pro-actively promoting the interests of our clients through real estate target marketing, honest evaluation of property values, and open communications.”

What makes you a strong candidate:
1. You are proficient in ksqlDB, Node.js, and Python.
2. English - Conversational.

Responsibilities:
1. Design and implement Kafka topics for processing communication data from Gmail, LeadSimple, and other sources.
2. Build a middleware API to:
   1. Trigger on incoming Gmail messages.
   2. Enrich messages using LeadSimple’s API.
   3. (Optionally) perform Rent Manager lookups for property metadata.
   4. Filter out spam and irrelevant messages using logic such as sender whitelisting or LeadSimple contact validation.
3. Maintain a conversation_context table using Tableflow or ksqlDB for enriched thread metadata.
4. Ensure messages are properly routed to Kafka (communication-events) with clean, AI-ready JSON.
5. Collaborate with internal teams to define message schemas, enrichment fields, and downstream AI use cases.
6. Set up lightweight monitoring and logging for errors and failed enrichments.
7. Advise on infrastructure best practices (e.g., using Redis for caching, managing Pub/Sub backpressure, etc.).

Requirements:
1. Proven experience working with Kafka (Confluent Cloud or self-hosted).
2. Hands-on experience with:
   1. Kafka Streams or ksqlDB.
   2. REST API integrations (especially the Gmail API and/or CRMs like LeadSimple).
3. Proficiency in Python, Node.js, or similar backend languages.
4. Familiarity with event-driven architecture and streaming design patterns.
5. Experience with at least one cloud provider (AWS, GCP, or Azure).
6. Solid understanding of asynchronous job handling, retry logic, and webhook workflows.
7. Ability to structure clean, enriched JSON events for AI, analytics, and automation.
8. Excellent communication skills to clearly explain technical concepts to ops-minded teams.

Nice-to-Have:
1. Experience with the OpenAI API, LangChain, or vector databases like Pinecone or Chroma.
2. Experience building agent-style tools (like Slack bots or AI copilots).
3. Prior exposure to property management systems or service scheduling tools.
4. Experience with Tableflow or other streaming-state tools.
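To give a flavor of the middleware work described above, here is a minimal Python sketch of the filter-and-enrich step: dropping unknown senders and shaping an AI-ready JSON event for the communication-events topic. The field names, whitelist rule, and enrichment shape are illustrative assumptions, not the team's actual schema.

```python
import json

# Hypothetical sender whitelist; in practice this would come from
# LeadSimple contact validation or a managed config.
SENDER_WHITELIST = {"tenant@example.com", "owner@example.com"}

def build_event(gmail_msg, leadsimple_contact):
    """Return an AI-ready JSON string for the communication-events
    topic, or None if the message should be filtered out."""
    sender = gmail_msg.get("from", "").lower()
    # Filter: drop messages from unknown senders with no CRM match.
    if sender not in SENDER_WHITELIST and leadsimple_contact is None:
        return None
    event = {
        "thread_id": gmail_msg["thread_id"],
        "sender": sender,
        "subject": gmail_msg.get("subject", ""),
        "body": gmail_msg.get("body", ""),
        # Enrichment payload from LeadSimple, when a contact matched.
        "contact": leadsimple_contact,
    }
    return json.dumps(event)
```

In production, the returned JSON would be handed to a Kafka producer rather than returned to the caller; keeping the function pure like this makes the filtering and enrichment logic easy to unit-test.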
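The retry-logic requirement can be sketched as a small exponential-backoff helper around a flaky enrichment call (e.g., a LeadSimple API lookup). The attempt count, delay values, and catch-all exception handling are illustrative assumptions, not a prescribed implementation.

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn(), retrying on exception with exponential backoff.

    Re-raises the last exception once attempts are exhausted, so the
    caller can route the message to a dead-letter path.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            # Back off base_delay * 2^attempt seconds before retrying.
            time.sleep(base_delay * (2 ** attempt))
```

A real pipeline would likely narrow the exception type to transient errors (timeouts, HTTP 429/5xx) and log each failed enrichment for the monitoring described above.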