Oracle has announced new distributed cloud innovations in Oracle Cloud Infrastructure (OCI) to meet growing global demand for AI and cloud services. The latest developments include Oracle Database@AWS, Oracle Database@Azure, Oracle Database@Google Cloud, OCI Dedicated Region, and OCI Supercluster, allowing customers to deploy OCI’s services at the edge, in their own data centers, across clouds, or in Oracle’s public cloud.
Oracle Cloud now operates in 85 regions globally, with plans for an additional 77 regions, making it available in more locations than any other hyperscaler.
In a statement, Mahesh Thiagarajan, executive vice president of Oracle Cloud Infrastructure, said, “Our priority is giving customers the choice and flexibility to leverage cloud services in the model that makes the most sense for their business.”
OCI’s distributed cloud gives customers flexibility in where and how they deploy AI infrastructure, helping them meet data privacy, sovereignty, and low-latency requirements.
Among its latest offerings, Oracle is taking orders for what it claims to be the largest AI supercomputer in the cloud: an OCI Supercluster with up to 131,072 NVIDIA Blackwell GPUs delivering 2.4 zettaFLOPS of peak performance. The cluster provides more than six times the GPU capacity of other hyperscalers, along with high-performance storage and low-latency cluster networking.
Oracle also introduced new infrastructure to support sovereign AI, using NVIDIA L40S and Hopper-architecture GPUs. These additions enable organizations to build AI models with strong data residency controls.
Further expanding its lineup, Oracle announced a smaller, more scalable OCI Dedicated Region configuration, “Dedicated Region25,” which starts at just three racks and is set to launch next year. The new offering aims to bring OCI’s full range of AI and cloud services to more customers, particularly those looking for a localized cloud solution.
Additionally, Oracle has enhanced its multicloud capabilities through partnerships with AWS, Azure, and Google Cloud, allowing customers to combine services from multiple providers to optimize performance and costs.
Finally, Oracle introduced enhancements to OCI Roving Edge Infrastructure, including a new three-GPU option designed for AI inferencing in remote locations. The ruggedized, portable device supports critical data processing even in disconnected environments.