LLMs unleashed: An IT/DevOps adventure with Amazon Bedrock
This session explores how large language models (LLMs) can be applied in IT and DevOps workflows using Amazon Bedrock, with practical examples of how generative AI can enhance log analysis, automate infrastructure tasks, and streamline operations.
While LLMs have gained widespread attention, their practical application in IT and DevOps environments is still evolving. In this session, Elad Hirsch demonstrates how generative AI, powered by Amazon Bedrock, can move beyond proof of concept to deliver real value in production environments.
Hirsch will provide an overview of LLM fundamentals and prompt engineering before diving into use cases tailored for platform and DevOps teams. Through a live demo, he will showcase how LLMs can be used to extract entities from logs, create retrieval-augmented generation (RAG) knowledge bases, and automate Terraform processes. The session will also address key challenges around LLM operations, including setting guardrails and monitoring AI behavior using Amazon CloudWatch.
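To give a flavor of the entity-extraction use case, here is a minimal sketch of calling a model through the Amazon Bedrock runtime API with boto3. The model ID, prompt wording, and sample log line are illustrative assumptions, not material from the session demo.

# Minimal sketch: entity extraction from a log line with Amazon Bedrock (boto3).
# Model ID, prompt, and sample log line are assumptions for illustration.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

log_line = '2024-05-01T12:03:44Z ERROR payment-api 10.0.3.17 "timeout calling s3://billing-exports"'

prompt = (
    "Extract the timestamp, severity, service name, IP address, and any AWS "
    "resources from this log line. Reply as JSON only.\n\n" + log_line
)

# Anthropic Claude models on Bedrock use the Messages API request body.
response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model choice
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    }),
)

result = json.loads(response["body"].read())
print(result["content"][0]["text"])  # JSON string with the extracted entities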
Attendees will leave with a clearer understanding of how to integrate LLMs into existing DevOps pipelines to improve efficiency, reduce manual effort, and unlock new capabilities within their teams.
Elad Hirsch
Tech Lead, CTO Office, TeraSky