Cloudflare integrates OpenAI's GPT-4.1 and Codex into its Agent Cloud, giving enterprises a secure, scalable platform for deploying AI agents.
Cloudflare has added OpenAI's GPT-4.1 and Codex models to its Agent Cloud platform, targeting enterprise customers building and scaling agentic workflows. The integration positions Cloudflare's edge infrastructure as the deployment layer for AI agents handling real-world tasks, pairing OpenAI's frontier models with Cloudflare's global network, security primitives, and developer tooling. Availability appears to be aimed at existing Cloudflare enterprise customers.
Cloudflare Agent Cloud now gives developers access to GPT-4.1 and Codex inside an infrastructure layer that handles routing, security, and global edge deployment out of the box. This matters because agentic workloads — multi-step, tool-using, long-running — have unique infra requirements that generic cloud functions don't handle well. Cloudflare's Workers and Durable Objects architecture is actually a solid fit for stateful agents, and adding OpenAI models natively removes one integration hurdle.
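To make the stateful-agent point concrete, here is a minimal sketch of the shape a Durable Object-backed agent session takes: per-session state that survives across requests. The class name and step schema are illustrative assumptions, and a plain `Map` stands in for Durable Object storage so the sketch runs anywhere; on Cloudflare's runtime the persistence comes from `state.storage`.

```javascript
// Illustrative sketch of a stateful agent session, modeled on the Durable
// Objects pattern. In a real Durable Object, `state.storage` persists steps
// across requests and isolates; here a Map stands in so the code is runnable.
class AgentSession {
  constructor(storage = new Map()) {
    this.storage = storage; // stand-in for Durable Object storage
  }

  // Append one step of a multi-step agent run and return the step count.
  async addStep(step) {
    const steps = (await this.storage.get("steps")) ?? [];
    steps.push(step);
    await this.storage.set("steps", steps);
    return steps.length;
  }

  // Read back the full run history for this session.
  async history() {
    return (await this.storage.get("steps")) ?? [];
  }
}
```

The point of the pattern is that each agent session gets its own single-threaded object with durable state, which is what long-running, tool-using workflows need and stateless cloud functions lack.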
If you're already running agents on Lambda or GCP Cloud Run, spin up a Cloudflare Worker with the OpenAI Codex integration this week and benchmark cold-start latency and per-agent cost against your current stack.
To get started, go to dash.cloudflare.com, navigate to Workers & Pages, and create a new Worker.
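A Worker for that benchmark can be very small. The sketch below proxies a prompt to OpenAI's chat completions endpoint; the env binding name (`OPENAI_API_KEY`) and the model id string are assumptions for illustration, not the Agent Cloud integration's actual API surface.

```javascript
// Hedged sketch: a Worker that forwards a prompt to OpenAI's chat
// completions API. Binding and model names are illustrative.
function buildOpenAIRequest(prompt, apiKey, model = "gpt-4.1") {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

// In a real Worker this object would be the module's default export.
const worker = {
  async fetch(request, env) {
    const { prompt } = await request.json();
    const { url, init } = buildOpenAIRequest(prompt, env.OPENAI_API_KEY);
    const upstream = await fetch(url, init);
    // Stream the upstream body back to the caller unchanged.
    return new Response(upstream.body, { status: upstream.status });
  },
};
```

Wrapping request construction in a pure function keeps the handler trivial and makes latency measurements isolate the network and cold-start costs you actually want to compare against Lambda or Cloud Run.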