
Chris Wright

@kernelcdub

Red Hat CTO. Tezos Foundation council member. Passion for open source SW innovation. Father and husband. Cyclist. Human.

Chris Wright reposted

Enjoyed spending time in London yesterday with top industry analysts and other @RedHat leaders. It was great to discuss how our open hybrid cloud strategy is helping customers navigate a complex IT landscape. Our open approach for AI is a critical part of that journey.


AI is changing how we do everything. There's lots of discussion about AI in software development, but not enough about how open source software development will evolve with AI. Let's have that discussion. redhat.com/en/blog/ai-ass…


Chris Wright reposted

Using light to transmit data rather than relying on electronic components could slash latency itpro.com/infrastructure…


vLLM day-zero support for Qwen3-Next... Hybrid attention, sparse MoE, and multi-token prediction... keep pushing forward!

Qwen3-Next dropped yesterday and you can run it with Red Hat AI today. ✅ Day-zero support in vLLM ✅ Day-one deployment with Red Hat AI Step-by-step guide: developers.redhat.com/articles/2025/… The future of AI is open.



Check out my new Technically Speaking episode with Bernd Greifeneder from @Dynatrace. We're talking about taming AI agents with observability and why being able to trust these systems is so critical. Give it a listen. youtube.com/watch?v=h-L0R_…

[Video card: Taming AI agents with observability ft. Bernd Greifeneder | Technic… — youtube.com]


It's awesome to see how we're making the OS feel faster and more secure for our customers with things like Lightspeed, Insights, image mode, and post-quantum security. This is what it’s all about—giving people the tools to move at the speed of innovation. youtube.com/watch?v=dhWW2U…


Chris Wright reposted

.@kernelcdub and Carlos Costa explain how a "shared vision" allows an entire industry, from hardware providers to cloud companies, to solve common pain points together and create a "growing pie" for everyone. Hear more about the power of open source collaboration in the full…


Chris Wright reposted

Ever wonder what the 'v' in vLLM stands for? 💡 @kernelcdub and @nickhill33 explain how "virtual" memory and PagedAttention make AI inference more efficient by solving GPU memory fragmentation. Tune into the full Technically Speaking episode to learn more about optimizing LLMs:…
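The "virtual memory" analogy in that clip can be sketched as a toy block-table allocator: the KV cache is split into fixed-size blocks, and each sequence maps its token positions to physical blocks the way an OS maps pages. This is a minimal illustration of the idea, not vLLM's actual API; all names here are invented for the sketch.

```python
BLOCK_SIZE = 4  # tokens per physical cache block (illustrative)

class PagedKVCache:
    """Toy PagedAttention-style allocator: no contiguous per-sequence
    slab, just a pool of blocks plus a per-sequence block table."""

    def __init__(self, num_blocks):
        self.free_blocks = list(range(num_blocks))
        self.block_tables = {}  # seq_id -> list of physical block ids

    def append_token(self, seq_id, token_idx):
        table = self.block_tables.setdefault(seq_id, [])
        if token_idx % BLOCK_SIZE == 0:      # crossed a block boundary,
            table.append(self.free_blocks.pop())  # grab one more block
        return table[-1]  # physical block holding this token's KV entry

    def free_sequence(self, seq_id):
        # finished request: its blocks return to the pool immediately,
        # so there is no fragmentation from over-reserved contiguous space
        self.free_blocks.extend(self.block_tables.pop(seq_id))

cache = PagedKVCache(num_blocks=8)
for t in range(6):
    cache.append_token("req-A", t)  # 6 tokens occupy only 2 blocks
cache.free_sequence("req-A")        # both blocks are reusable at once
```

The win the episode describes is exactly this: memory is reserved block-by-block as tokens arrive instead of pre-allocating for the maximum sequence length, so GPU memory is not fragmented by mostly-empty reservations.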


Nick Hill digs into the details of vLLM with me on Technically Speaking. Helpful for understanding why vLLM is so important in high-performance, open source AI inferencing.

How do you solve AI's biggest performance hurdles? On Technically Speaking, @kernelcdub & Nick Hill dive into vLLM, exploring how techniques like PagedAttention solve memory bottlenecks & accelerate inference: red.ht/4lDjJ5P.



Great discussion with @AMD about how we see the future of AI and the importance of openness and choice.

The future of AI is being built on openness, efficiency, and choice, and AMD is helping make it possible. Chris Wright, CTO and SVP at @RedHat, explains the state of AI. Open AI is accelerating - from open source frameworks to open LLMs that now match or even exceed…



It's tools, it's models, it's agents...the future of #AI is open source and collaboration. We have to build this together.

"The future of AI is going to be open sourced." In this clip, @kernelcdub of @RedHat explains why open source, open collaboration, and global ecosystems will define the next wave of AI, and how Red Hat empowers developers everywhere



It's all about open! Great collaboration across #vLLM and #llm-d to create efficiency for inference


Rapid fire buzzword decoding with @addvin

What's the first thing that comes to mind when tech leaders hear today's top AI & enterprise buzzwords? 🤔 Watch Red Hat CTO @kernelcdub & AI CTO @addvin tackle them in a quick-fire word association game! Don't miss the latest episode of Technically Speaking 🎙️for their deep-dive…



Hope you enjoy this great discussion I had with @addvin on Technically Speaking. The future of open source AI inference!

AI Inference at scale? 🤔 Open source to the rescue! 💡 Dive into #vLLM, distributed systems, and the path to practical #AI in the enterprise with @kernelcdub and @addvin on the latest "Technically Speaking." red.ht/3FEEZJ6




Here's how we're achieving R1-like reasoning with small models, leveraging probabilistic inference-time scaling w/out using DeepSeek or derivatives. Results are impressive! MATH w/ Llama 8B approaches GPT-4o, and w/ Qwen2.5 Math 7B Instruct hits o1 level. red-hat-ai-innovation-team.github.io/posts/r1-like-…


Inference-time scaling brings smaller LLMs to o1-level capabilities. This is why we're so excited about the potential of smaller, open source models. Awesome work @ishapuri101 and @RedHat AI Innovation team!

[1/x] can we scale small, open LMs to o1 level? Using classical probabilistic inference methods, YES! Joint @MIT_CSAIL / @RedHat AI Innovation Team work introduces a particle filtering approach to scaling inference w/o any training! check out …abilistic-inference-scaling.github.io

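The particle-filtering idea in that thread can be sketched in a few lines: keep N candidate generations ("particles"), extend each by one step, weight the partial answers with a reward score, and resample so compute concentrates on promising partial solutions — no training required. The step generator and reward function below are toy stand-ins, not the paper's actual models or code.

```python
import random

def particle_filter_search(step_fn, reward_fn, n_particles=4, n_steps=3, seed=0):
    """Toy particle-filter inference-time scaling: extend, weight, resample."""
    rng = random.Random(seed)
    particles = [""] * n_particles
    for _ in range(n_steps):
        # extend each particle by one "reasoning step"
        particles = [step_fn(p, rng) for p in particles]
        # weight each partial solution by an approximate reward
        weights = [reward_fn(p) for p in particles]
        total = sum(weights)
        probs = [w / total for w in weights]
        # multinomial resampling: high-reward particles get duplicated,
        # low-reward ones die off, focusing compute on good trajectories
        particles = rng.choices(particles, weights=probs, k=n_particles)
    return max(particles, key=reward_fn)

# toy demo: steps append digits; the stand-in reward prefers more 7s
best = particle_filter_search(
    step_fn=lambda p, rng: p + str(rng.randint(0, 9)),
    reward_fn=lambda p: 1.0 + p.count("7"),
)
```

In the real setting the step function would be a small open LM proposing the next reasoning chunk and the reward a verifier or process reward model; the resampling loop is what lets a small model spend extra inference compute instead of extra parameters.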


Chris Wright reposted

We agree, @kernelcdub: the future of AI is open source 👐 Onsite at #AWSreInvent? Stop by booth #844 to see how you can transform your business with open technology, open culture, and open processes: red.ht/3B0QA2G

