Quarkus meets AI: Build your own LLM-powered application

I recently had the pleasure of co-presenting a talk with Dimitris Andreadis at Devoxx Greece, and I’m excited to share the recording with you.

In this session, we explored how Quarkus, a framework for building Cloud-Native applications in Java, can be used to integrate Large Language Models (LLMs) into your application. We also introduced Langchain4j, a powerful library designed to seamlessly integrate Java applications with LLMs.
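To give a taste of what that integration looks like, here is a minimal sketch of a Langchain4j "AI service" in Quarkus. The annotations come from the quarkus-langchain4j extension; the interface name and prompt text are illustrative assumptions of mine, not code from the talk:

```java
package org.acme;

import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

// Declarative "AI service": you write only the interface, and the
// quarkus-langchain4j extension generates an implementation that sends the
// prompt to the configured LLM and returns its answer.
@RegisterAiService
public interface Assistant {

    @SystemMessage("You are a helpful assistant answering questions about Java and Quarkus.")
    String chat(@UserMessage String question);
}
```

The generated bean can then be injected like any other CDI bean, for example into a REST resource, and the model it talks to is selected through standard Quarkus configuration.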

One of the highlights of our talk was the RAG (Retrieval-Augmented Generation) demo, where we showed how to leverage existing features from the ecosystem to build effective data-retrieval strategies. In the demo, we used Quarkus and Langchain4j to retrieve data from a SQL database with an LLM, generating SQL queries dynamically from the user's input.
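The recording shows the real implementation, but the shape of the idea is roughly the following sketch; the table schema, prompt wording and interface name here are my own assumptions, not the demo's actual code:

```java
package org.acme;

import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

// The system message carries the database schema; the model's only job is to
// turn a natural-language question into a SELECT statement that the
// application then runs against its datasource.
@RegisterAiService
public interface SqlGenerator {

    @SystemMessage("""
            You translate user questions into SQL for a PostgreSQL database
            containing this table:

            CREATE TABLE talks (id BIGINT, title VARCHAR(255), speaker VARCHAR(255), track VARCHAR(100));

            Return a single valid SELECT statement and nothing else.
            """)
    String toSql(@UserMessage String question);
}
```

The returned statement is then executed with a plain JDBC call (in a real application you would validate it first, for example by restricting it to read-only queries), and the resulting rows can be fed back to the model so it can phrase the final answer.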

To further enhance the demo, we used Apache Camel to transform the source data and ingest it into the database. This allowed us to showcase how Camel’s routing engine can be used to prepare any type of data for RAG.
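As a rough sketch of that ingestion step (the directory, table and columns are made up for illustration), a Camel route in the Java DSL could look like this:

```java
package org.acme;

import jakarta.enterprise.context.ApplicationScoped;
import org.apache.camel.builder.RouteBuilder;

// Watches a directory for CSV files, turns every line into an INSERT statement
// and hands it to the JDBC component, so the data ends up in the table that the
// RAG retrieval step later queries. Real code would parse the CSV properly and
// use parameterized statements instead of string concatenation.
@ApplicationScoped
public class IngestionRoute extends RouteBuilder {

    @Override
    public void configure() {
        from("file:data/incoming?include=.*\\.csv")
            .split(body().tokenize("\n")).streaming()   // one exchange per CSV line
            .process(exchange -> {
                String[] fields = exchange.getMessage().getBody(String.class).split(",");
                exchange.getMessage().setBody(
                        "INSERT INTO talks (title, speaker) VALUES ('"
                                + fields[0].trim() + "', '" + fields[1].trim() + "')");
            })
            .to("jdbc:camel");                          // name of the registered DataSource bean
    }
}
```

Because the route is just Camel, the same pattern works for any source Camel can read, not only CSV files.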

You can access the recording below:

If you have any questions or feedback, please don’t hesitate to reach out. I’d love to hear your thoughts on this exciting topic!

Enjoy the talk!
