Senior GenAI Engineer

For LATAM

What you’ll be doing

As a Senior GenAI Engineer at DinoCloud, you will participate in the end-to-end cycle of data analysis, ML, and GenAI on AWS: from EDA and experimental design through training, evaluation, deployment, and operation of ML models and LLM applications (RAG) in production.

  • Design Generative AI-based architectures and contribute to project planning.
  • Design and deploy Generative AI models with rigorous cost, latency, and traceability evaluation. Build LLM applications using Amazon Bedrock (Agents, Knowledge Bases, Guardrails, quality evaluation, and hallucination detection) and RAG over internal data.
  • Integrate and operate Generative AI or AI Agent architectures: conversational chatbots, process automation, personalization, and enhancement of user experience through LLM-driven decision-making.
  • Ensure security, governance, cost control, and reliability; participate in discovery and project prioritization.
  • Implement MLOps pipelines for the complete lifecycle of generative models, including versioning, A/B testing, and automated deployments.
  • Collaborate with engineering and business teams to accelerate use cases, enable access to information through natural language, and generate measurable impact on our clients’ processes.

What would you need to succeed in this role?

  • Experience with Amazon Bedrock in production environments, including Agents, Knowledge Bases (embeddings/vector stores), Guardrails, and model selection and evaluation (Claude, Titan, or others).
  • Hands-on experience in software development and software architecture design related to AI Agent or Generative AI projects.
  • Practical experience with Generative AI or AI Agent architectures, including:
      • Design and implementation of Generative AI-based architectures.
      • Development and integration of LLMs.
      • Management of databases for storing conversational context.
  • Experience with RAG-based solutions: retrieval design, chunking and embedding strategies, evaluation, latency and cost optimization.
  • Strong knowledge of security and compliance (PII handling, IAM, encryption in transit and at rest).
  • English level C1 for technical communication and documentation.

It is a plus if you have:
  • Experience with Spark/Databricks, Snowflake; Feature Stores; large-scale experimentation; A/B testing.
  • Strong background in data architectures: data warehouses, data lakes, event-driven and streaming architectures.
  • Knowledge of observability and monitoring (CloudWatch, Prometheus, custom metrics).
  • Experience in BI and visualization (QuickSight, Power BI, Looker) and data storytelling.
  • Containerization and orchestration (Docker, Kubernetes, Amazon ECS/EKS).
  • Relevant AWS certifications (ML Specialty, Data Engineer, Solutions Architect).
  • Experience with Amazon Q, LangChain/LlamaIndex, OpenSearch/Pinecone/FAISS.
  • Knowledge of model fine-tuning and optimization techniques (LoRA, QLoRA).