Paul Czarkowski is a Managed OpenShift Black Belt at Red Hat. Paul helps people be more awesome by closing the gap between the various teams in an IT organization, from Development to Operations to Security and everyone in between.
Paul has over 20 years of experience running IT systems, ranging from small startups like Bluebox to large video game publishers like Electronic Arts, where he honed his skills in operations and automation before moving into his current role.
In the last few years Paul has spoken at a number of events, including OpenStack Summits, DockerCon, and KubeCon. He also loves to speak at smaller gatherings such as local tech meetups.
Paul is also really slack about keeping this page up to date.
Large language models (LLMs) and generative AI tools are more accessible than ever—but do you really want to send all your data to the cloud? In this talk, we’ll explore how to build your own private AI assistant, running entirely on open-source software and self-hosted hardware.
We’ll cover:
✅ Hardware choices—from budget-friendly setups to high-performance AI rigs.
✅ Hosting LLMs locally with Ollama and picking the right models for your needs.
✅ Connecting AI to external tools—from coding assistance in your IDE to reading and analyzing legal documents.
✅ Generating images with Stable Diffusion.
✅ Ditching Alexa—integrating voice-controlled AI with Home Assistant for fully private home automation.
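As a small taste of the Ollama piece, getting a local model running is only a couple of commands. This is a minimal sketch, assuming Ollama is already installed and the `llama3` model fits on your hardware; the exact model name is just an example, and smaller models are available for budget setups.

```shell
# Pull a model to your machine (downloads once, then runs fully offline).
ollama pull llama3

# Chat with it interactively in the terminal.
ollama run llama3 "Summarize this talk abstract in one sentence."

# Ollama also exposes a local HTTP API on port 11434, which is what
# IDE plugins and other tools connect to for coding assistance.
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello!", "stream": false}'
```

Nothing here leaves your machine, which is the whole point: the same local API endpoint is what you later wire into your editor or Home Assistant.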
Whether you’re looking to enhance your workflow, boost privacy, or just tinker with AI at home, this session will give you everything you need to get started—without breaking the bank or relying on Big Tech.