Pedro Bertoluchi

Applied AI Integration

RAG, conversational chatbots, tool-calling agents and unstructured-data ingestion, built on Azure OpenAI, OpenAI, Anthropic and other providers with Semantic Kernel and LangChain.

Runs on Azure, AWS, GCP or your current infrastructure. I suggest what makes the most sense for your case.


When you call me in

Your company has data, knowledge and repeating processes, but off-the-shelf chatbots don't fit and SaaS providers don't understand your domain. That's when you call me in: a custom integration on Azure OpenAI, OpenAI, Anthropic or another provider, picked for your case.

What I deliver

  • RAG with embeddings and a vector store sized for your volume

  • Ingestion pipelines (PDF, spreadsheet, voice, email) with traceability

  • Answer evaluation and regression (golden set + metrics)

  • Prompt, cost and PII guardrails with observability
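To make the first deliverable concrete: the core of RAG retrieval is embedding the query and ranking stored chunks by vector similarity. In production this runs against Pinecone or Postgres pgvector with a provider embedding model; the sketch below is a hypothetical in-memory stand-in that shows only that ranking step.

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query_vec, store, k=3):
    # store: list of (doc_id, embedding) pairs, e.g. loaded from a vector DB.
    # Returns the ids of the k chunks most similar to the query.
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]
```

The retrieved chunks are then passed to the model as context; a vector store does the same ranking with an index instead of a linear scan.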

Typical stack

  • Azure OpenAI
  • Semantic Kernel
  • LangChain
  • FastAPI
  • Document Intelligence
  • Pinecone/Postgres pgvector

Frequently asked

Time to a useful PoC?

Three to four weeks to something your team can test internally under production-like conditions.

OpenAI direct, Azure or another provider?

I pick based on the case: Azure OpenAI when the client already runs on Azure and wants centralised compliance, OpenAI direct for simplicity, Anthropic when its models are the better fit for the task.

What about token cost?

It's part of scope. Cost guardrails and observability make token spend visible from day one, and model selection is by cost/quality evaluation, not hype.
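The cost estimate itself is simple arithmetic over expected traffic. A rough sketch, with all prices and volumes hypothetical placeholders (real per-token prices vary by provider and model):

```python
def monthly_token_cost(requests_per_day, in_tokens, out_tokens,
                       in_price_per_1k, out_price_per_1k, days=30):
    # Estimated monthly spend: average input/output tokens per request
    # multiplied by assumed per-1K-token prices and request volume.
    per_request = (in_tokens / 1000 * in_price_per_1k
                   + out_tokens / 1000 * out_price_per_1k)
    return per_request * requests_per_day * days
```

Running this with the client's own volumes and the current provider price sheet is what turns "what about token cost?" into a number in the scope document.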

How do you measure quality?

Versioned golden set + sampled human review + feedback telemetry.
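The golden-set part of that answer can be sketched as a regression gate: a versioned list of question/expected pairs is replayed against the system, and the pass rate has to clear a threshold before a prompt or model change ships. The substring check below is a deliberately simple stand-in for the real metrics; `answer_fn` is a hypothetical hook into the system under test.

```python
def golden_set_pass_rate(golden, answer_fn):
    # golden: list of (question, expected_fragment) pairs from the
    # versioned golden set. answer_fn: callable question -> answer text.
    # Metric here is a plain case-insensitive containment check.
    passed = sum(
        1 for question, expected in golden
        if expected.lower() in answer_fn(question).lower()
    )
    return passed / len(golden)
```

In CI this runs on every prompt or model change; sampled human review and feedback telemetry cover what an automatic metric misses.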

Ready to start?

30-minute call. No cost, no pitch.