What will be the most popular deployment environment for NVIDIA NIM by the end of 2024?
Cloud • 33%
On-premise • 33%
Air-gapped • 33%
Resolution source: NVIDIA's official reports or surveys conducted by third-party analysts.
NVIDIA Announces Global Availability of NIM at COMPUTEX 2024 to Simplify Generative AI Deployment with 150+ Partners
Jun 3, 2024, 05:00 PM
NVIDIA has announced the global availability of NIM (NVIDIA Inference Microservices), a platform designed to simplify the deployment of generative AI applications. Unveiled at COMPUTEX 2024, NVIDIA NIM aims to streamline model deployment for enterprise applications, with over 150 partners across the AI ecosystem. The platform integrates with KServe so that AI models can be deployed like any other large enterprise application. NVIDIA NIM can be deployed in the cloud, on-premise, or in air-gapped environments, providing a secure, production-optimized infrastructure. Additionally, WhyLabs has integrated with NVIDIA NIM to enable tracing and guardrails, enhancing the security and observability of generative AI models, and Haystack RAG pipelines can also be deployed with NIMs. The announcement was made on June 2, 2024, and reiterated on June 3, 2024.
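NIM microservices expose an OpenAI-compatible HTTP API, which is what lets a deployed model be consumed like any other enterprise service regardless of whether it runs in the cloud, on-premise, or air-gapped. A minimal sketch of querying such an endpoint follows; the base URL and model name are placeholders for an actual deployment, not values from the announcement:

```python
import json
import urllib.request

# Placeholder endpoint for a locally running NIM container;
# NIM serves an OpenAI-compatible API under the /v1 route.
NIM_BASE_URL = "http://localhost:8000/v1"


def build_chat_request(model: str, prompt: str, max_tokens: int = 64) -> dict:
    """Build an OpenAI-style chat-completion payload for a NIM endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


def query_nim(model: str, prompt: str) -> str:
    """POST the payload to the chat-completions route and return the reply text."""
    payload = build_chat_request(model, prompt)
    req = urllib.request.Request(
        f"{NIM_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Calling query_nim() requires a running NIM container; the payload
    # itself can be inspected without one.
    print(build_chat_request("meta/llama3-8b-instruct", "Summarize NIM in one sentence."))
```

Because the request shape matches the OpenAI chat-completions format, existing client tooling can generally point at a NIM deployment by swapping the base URL.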