Why it matters: Microsoft and Nvidia have dramatically increased their focus on AI following the rising popularity of generative AI, but much of the technology relies on cloud servers. As AI-capable hardware begins to reach consumers, the two companies are unveiling tools to lessen users’ reliance on remote AI systems.
At the recent Ignite 2023 event, Microsoft and Nvidia revealed tools to help users develop and run generative AI applications locally. The new software leverages Windows 11’s increased focus on AI alongside popular AI models from Microsoft, Meta, and OpenAI.
Microsoft’s new Windows AI Studio consolidates numerous models and development tools from catalogs like Azure AI Studio and Hugging Face. It includes configuration interfaces, walkthroughs, and other instruments to help developers build and refine small language models.
Windows AI Studio lets users work with models like Meta’s Llama 2 and Microsoft’s Phi. Microsoft will initially release the workflow as a VS Code extension in the coming weeks. Presumably, AI Studio’s local AI workloads could take advantage of hardware like neural processing units (NPUs), which will become prevalent in upcoming CPU generations.
Meanwhile, Nvidia announced a significant upcoming update to TensorRT-LLM, promising to expand and accelerate AI applications on Windows 11 while keeping data on the local machine rather than cloud servers, which could address some users’ privacy and security concerns. The improvements will be available on laptops, desktops, and workstations with GeForce RTX graphics cards that have at least 8GB of VRAM.
One new feature is a wrapper that makes TensorRT-LLM compatible with OpenAI’s Chat API. Furthermore, when version 0.6.0 arrives later this month, it will make AI inference up to five times faster and add support for new large language models such as Mistral 7B and Nemotron-3 8B on any GeForce RTX 30 or 40 Series GPU with at least 8GB of VRAM.
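Because the wrapper mirrors OpenAI's Chat API, applications written against OpenAI's request format should be able to target a local TensorRT-LLM server instead by changing the endpoint. The sketch below shows the familiar Chat Completions request shape; the local URL and model name are assumptions for illustration and depend on how the local server is configured.

```python
import json

# Hypothetical local endpoint; the actual host and port depend on
# how the TensorRT-LLM wrapper server is launched on the machine.
LOCAL_ENDPOINT = "http://localhost:8000/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "llama-2-7b-chat") -> dict:
    """Build an OpenAI Chat API-style request body.

    The same JSON shape used against api.openai.com should work here,
    since the wrapper's goal is drop-in compatibility with the Chat API.
    The default model name is an assumption, not a documented value.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
        "temperature": 0.7,
    }

# Serialize the body that would be POSTed to LOCAL_ENDPOINT.
body = build_chat_request("Summarize TensorRT-LLM in one sentence.")
print(json.dumps(body, indent=2))
```

In practice, an existing OpenAI client could simply be pointed at the local URL, keeping prompts and responses entirely on the user's hardware.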
The company will soon release the update on its GitHub repo and make the latest optimized AI models available at ngc.nvidia.com. Furthermore, those interested in the upcoming AI Workbench model customization toolkit can now join the early access list.
In related news, Microsoft has folded Bing’s AI-powered chatbot into the Copilot brand. Users who open the Bing chat window in Edge or the new Copilot assistant in Windows 11 might now see the name “Copilot with Bing Chat.”
Bing Chat initially appeared as a chatbot within Edge before the company brought its functionality into the Copilot assistant that debuted with the recent Windows 11 23H2 update. Unifying the features under one name could more firmly position the interface as Microsoft’s answer to ChatGPT.