Agentic workflow considerations for enterprise
Putting an LLM in production does not mean you have an agent. Having it make a decision gets you closer. The real test, however, is whether that LLM is part of an orchestrated workflow that allows for multiple iterations. In this session, we look at the key components that make up this emerging pattern.
Panelist

Panelist

Panelist

Moderator

Michael Liendo
Senior Developer Advocate, Orkes
What do superhero battles and modern workflows have in common? Both need cleanup—and coordination. In this talk, we’ll explore how Orkes Conductor can be used to build agentic workflows inspired by the chaos of the Marvel universe. You’ll see how to model a real-world insurance claim system using human-in-the-loop tasks, conditional logic with switches, iterative loops, and LLM-powered decisions. Whether you're orchestrating microservices or enabling dynamic decision-making in your app, this session will give you a practical and visual breakdown of what it means to build workflows that are adaptive, explainable, and production-ready.
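To make the pattern concrete ahead of the session, here is a minimal sketch of what such a claim workflow could look like, written as a Python dict that mirrors the open-source Conductor workflow JSON schema (SIMPLE, SWITCH, DO_WHILE, and HUMAN task types). The task names, the llm_assess_claim and auto_approve_claim workers, and the routing values are hypothetical illustrations, not the actual workflow demonstrated in the talk.

import json

# A minimal sketch of a Conductor-style workflow definition for the insurance
# claim example. Task names, reference names, and the worker tasks are
# hypothetical; the structure follows the open-source Conductor workflow JSON
# schema and may differ from what the session actually demonstrates.

ASSESS_CLAIM = {
    # SIMPLE task backed by a custom worker that calls an LLM to classify the claim.
    "name": "llm_assess_claim",
    "taskReferenceName": "assess_claim_ref",
    "type": "SIMPLE",
    "inputParameters": {"claim": "${workflow.input.claim}"},
}

AUTO_APPROVE = {
    # Straight-through processing branch, backed by another hypothetical worker.
    "name": "auto_approve_claim",
    "taskReferenceName": "auto_approve_ref",
    "type": "SIMPLE",
    "inputParameters": {"claimId": "${workflow.input.claimId}"},
}

HUMAN_REVIEW = {
    # Human-in-the-loop: pauses the workflow until an adjuster signs off.
    "name": "adjuster_review",
    "taskReferenceName": "adjuster_review_ref",
    "type": "HUMAN",
}

ROUTE_CLAIM = {
    # Conditional logic with a switch: branch on the LLM's decision output.
    "name": "route_claim",
    "taskReferenceName": "route_claim_ref",
    "type": "SWITCH",
    "evaluatorType": "value-param",
    "expression": "decision",
    "inputParameters": {"decision": "${assess_claim_ref.output.decision}"},
    "decisionCases": {
        "auto_approve": [AUTO_APPROVE],
        "needs_review": [HUMAN_REVIEW],
    },
    "defaultCase": [],
}

workflow_def = {
    "name": "insurance_claim_agentic",
    "version": 1,
    "schemaVersion": 2,
    "tasks": [
        {
            # Iterative loop: re-assess and re-route for up to three iterations.
            "name": "claim_loop",
            "taskReferenceName": "claim_loop_ref",
            "type": "DO_WHILE",
            "loopCondition": "if ($.claim_loop_ref['iteration'] < 3) { true; } else { false; }",
            "loopOver": [ASSESS_CLAIM, ROUTE_CLAIM],
        }
    ],
}

if __name__ == "__main__":
    print(json.dumps(workflow_def, indent=2))

In practice, a definition like this would be registered with a Conductor server through its metadata API or an SDK, with the SIMPLE tasks backed by workers that call the LLM and the claims system.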
Sign up now

