Chris Benson

Appears in 291 Episodes

AI predictions for 2024

We scoured the internet to find all the AI-related predictions for 2024 (at least from people who might know what they are talking about), and, in this episode, we ta...

Open source, on-disk vector search with LanceDB

Prashanth Rao mentioned LanceDB as a standout among the many vector DB options in episode #234. Now, Chang She (co-founder and CEO of LanceDB) joins us to talk thro...

The state of open source AI

The new open source AI book from PremAI starts with “As a data scientist/ML engineer/developer with a 9 to 5 job, it’s difficult to keep track of all the innovations.”...

The OpenAI debacle (a retrospective)

Daniel & Chris conduct a retrospective analysis of the recent OpenAI debacle in which CEO Sam Altman was sacked by the OpenAI board, only to return days later with a n...

Generating product imagery at Shopify

Shopify recently released a Hugging Face space demonstrating very impressive results for replacing background scenes in product imagery. In this episode, we hear the b...

AI trailblazers putting people first

According to Solana Larsen: “Too often, it feels like we have lost control of the internet to the interests of Big Tech, Big Data — and now Big AI.” In the latest seas...

Government regulation of AI has arrived

On Monday, October 30, 2023, the U.S. White House issued its Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence. Two d...

Self-hosting & scaling models

We’re excited to have Tuhin join us on the show once again to talk about self-hosting open access models. Tuhin’s company Baseten specializes in model deployment and m...

Deep learning in Rust with Burn 🔥

It seems like everyone is interested in Rust these days. Even the most popular Python linter, Ruff, isn’t written in Python! It’s written in Rust. But what is the stat...

Generative models: exploration to deployment

What is the model lifecycle like for experimenting with and then deploying generative AI models? Although there are some similarities, this lifecycle differs somewhat ...

Automate all the UIs!

Dominik Klotz from askui joins Daniel and Chris to discuss the automation of UI, and how AI empowers them to automate any use case on any operating system. Along the w...

Fine-tuning vs RAG

In this episode we welcome back our good friend Demetrios from the MLOps Community to discuss fine-tuning vs. retrieval augmented generation. Along the way, we also ch...

Automating code optimization with LLMs

You might have heard a lot about code generation tools using AI, but could LLMs and generative AI make our existing code better? In this episode, we sit down with Mike...

The new AI app stack

Recently a16z released a diagram showing the “Emerging Architectures for LLM Applications.” In this episode, we expand on things covered in that diagram to a more gene...

Blueprint for an AI Bill of Rights

In this Fully Connected episode, Daniel and Chris kick it off by noting that Stability AI released their SDXL 1.0 image generation model! They discuss its virtues, and then dive into a d...

Vector databases (beyond the hype)

There’s so much talk (and hype) these days about vector databases. We thought it would be timely and practical to have someone on the show that has been hands on with ...

There's a new Llama in town

It was an amazing week in AI news. Among other things, there is a new NeRF and a new Llama in town!!! Zip-NeRF can create some amazing 3D scenes based on 2D images, an...

Legal consequences of generated content

As a technologist, coder, and lawyer, few people are better equipped to discuss the legal and practical consequences of generative AI than Damien Riehl. He demonstrate...

A developer's toolkit for SOTA AI

Chris sat down with Varun Mohan and Anshul Ramachandran, CEO / Cofounder and Lead of Enterprise and Partnership at Codeium, respectively. They discussed how to streaml...

Cambrian explosion of generative models

In this Fully Connected episode, Daniel and Chris explore recent highlights from the current model proliferation wave sweeping the world - including Stable Diffusion X...

From ML to AI to Generative AI

Chris and Daniel take a step back to look at how generative AI fits into the wider landscape of ML/AI and data science. They talk through the differences in how one ap...

Accidentally building SOTA AI

Lately.AI has been working for years on content generation systems that capture your unique “voice” and are tailored to your unique audience. At first, they didn’t kno...

Controlled and compliant AI applications

You can’t build robust systems with inconsistent, unstructured text output from LLMs. Moreover, LLM integrations scare corporate lawyers, finance departments, and secu...

Data augmentation with LlamaIndex

Large Language Models (LLMs) continue to amaze us with their capabilities. However, the utilization of LLMs in production AI applications requires the integration of p...

The last mile of AI app development

There are a ton of problems around building LLM apps in production and the last mile of that problem. Travis Fischer, builder of open AI projects like @ChatGPTBot, joi...

Causal inference

With all the LLM hype, it’s worth remembering that enterprise stakeholders want answers to “why” questions. Enter causal inference. Paul Hünermund has been doing resea...

Capabilities of LLMs 🤯

Large Language Model (LLM) capabilities have reached new heights and are nothing short of mind-blowing! However, with so many advancements happening at once, it can be...

Computer scientists as rogue art historians

What can art historians and computer scientists learn from one another? Actually, a lot! Amanda Wasielewski joins us to talk about how she discovered that computer sci...

Accelerated data science with a Kaggle grandmaster

Daniel and Chris explore the intersection of Kaggle and real-world data science in this illuminating conversation with Christof Henkel, Senior Deep Learning Data Scien...

Explainable AI that is accessible for all humans

We are seeing an explosion of AI apps that are (at their core) a thin UI on top of calls to OpenAI generative models. What risks are associated with this sort of appro...
