
NVIDIA Launches NIM Microservices for Generative AI in Japan, Taiwan

Four new microservices help accelerate deployment of sovereign AI applications that offer advanced cultural and language fluency.

Nations around the world are pursuing sovereign AI to produce artificial intelligence using their own computing infrastructure, data, workforce and business networks to ensure AI systems align with local values, laws and interests.

In support of these efforts, NVIDIA today announced the availability of four new NVIDIA NIM microservices that enable developers to more easily build and deploy high-performing generative AI applications.

The microservices support popular community models tailored to meet regional needs. They enhance user interactions through accurate understanding and improved responses based on local languages and cultural heritage.

In the Asia-Pacific region alone, generative AI software revenue is expected to reach $48 billion by 2030 — up from $5 billion this year, according to ABI Research.

Llama-3-Swallow-70B, trained on Japanese data, and Llama-3-Taiwan-70B, trained on Mandarin data, are regional language models that provide a deeper understanding of local laws, regulations and customs.

The RakutenAI 7B family of models, built on Mistral-7B, was trained on English and Japanese datasets and is available as two NIM microservices, one for Chat and one for Instruct. The models have achieved leading scores among open Japanese large language models, earning the top average score in the LM Evaluation Harness benchmark carried out from January to March 2024.

Training a large language model (LLM) on regional languages enhances the effectiveness of its outputs by ensuring more accurate and nuanced communication, as it better understands and reflects cultural and linguistic subtleties.

The models offer leading performance for Japanese and Mandarin language understanding, regional legal tasks, question-answering, and language translation and summarization compared with base LLMs like Llama 3.

Nations worldwide — from Singapore, the United Arab Emirates, South Korea and Sweden to France, Italy and India — are investing in sovereign AI infrastructure.

The new NIM microservices allow businesses, government agencies and universities to host native LLMs in their own environments, enabling developers to build advanced copilots, chatbots and AI assistants. 

 

Developing Applications With Sovereign AI NIM Microservices

Developers can easily deploy the sovereign AI models, packaged as NIM microservices, into production while achieving improved performance.
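As an illustration of the deployment model, a self-hosted NIM microservice runs as a GPU-enabled container pulled from NVIDIA's NGC registry. The sketch below is a hedged example: the image path, tag and port are placeholders, and the exact values for a given model come from the NGC catalog entry for that microservice.

```shell
# Authenticate against the NGC registry with your NGC API key
# ('$oauthtoken' is the literal username NGC expects).
export NGC_API_KEY="<your-ngc-api-key>"
echo "$NGC_API_KEY" | docker login nvcr.io --username '$oauthtoken' --password-stdin

# Run the microservice container with GPU access, exposing its HTTP port.
# The image path below is a placeholder; substitute the one listed in the
# NGC catalog for the model you are deploying.
docker run --rm --gpus all \
  -e NGC_API_KEY \
  -p 8000:8000 \
  nvcr.io/nim/<org>/<model>:latest
```

Once the container is up, applications talk to it over a local HTTP endpoint, so the same client code works against a self-hosted microservice or NVIDIA's hosted APIs.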

The microservices, available with NVIDIA AI Enterprise, are optimized for inference with the NVIDIA TensorRT-LLM open-source library.

NIM microservices for Llama 3 70B — which was used as the base model for the new Llama-3-Swallow-70B and Llama-3-Taiwan-70B NIM microservices — can provide up to 5x higher throughput. This lowers the total cost of running the models in production and provides better user experiences by decreasing latency.

The new NIM microservices are available today as hosted application programming interfaces (APIs).
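As a minimal sketch of what calling a hosted NIM API looks like, the snippet below builds an OpenAI-style chat-completions request. The endpoint URL and model identifier are assumptions made for illustration; the exact values for each microservice are listed in NVIDIA's API catalog. The code only constructs the request (sending it would be a single HTTP POST with these headers and body).

```python
import json
import os

# Assumed endpoint for NVIDIA's hosted, OpenAI-compatible NIM APIs;
# verify against the API catalog before use.
API_URL = "https://integrate.api.nvidia.com/v1/chat/completions"

def build_chat_request(model, prompt, api_key):
    """Build the URL, headers, and JSON body for one chat-completion call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }
    return API_URL, headers, body

url, headers, body = build_chat_request(
    "tokyotech-llm/llama-3-swallow-70b-instruct-v0.1",  # hypothetical model ID
    "日本の祝日について教えてください。",  # "Tell me about Japanese holidays."
    os.environ.get("NVIDIA_API_KEY", ""),
)
print(json.dumps(body, ensure_ascii=False))
```

Because the hosted endpoints follow the familiar chat-completions shape, existing OpenAI-compatible client libraries can typically be pointed at them by changing only the base URL and API key.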

 

Tapping NVIDIA NIM for Faster, More Accurate Generative AI Outcomes

The NIM microservices accelerate deployments, enhance overall performance and provide the necessary security for organizations across global industries, including healthcare, finance, manufacturing, education and legal.

The Tokyo Institute of Technology fine-tuned Llama-3-Swallow-70B using Japanese-language data.

“LLMs are not mechanical tools that provide the same benefit for everyone. They are rather intellectual tools that interact with human culture and creativity. The influence is mutual where not only are the models affected by the data we train on, but also our culture and the data we generate will be influenced by LLMs,” said Rio Yokota, professor at the Global Scientific Information and Computing Center at the Tokyo Institute of Technology. “Therefore, it is of paramount importance to develop sovereign AI models that adhere to our cultural norms. The availability of Llama-3-Swallow as an NVIDIA NIM microservice will allow developers to easily access and deploy the model for Japanese applications across various industries.”

For instance, Japanese AI company Preferred Networks uses the model to develop Llama3-Preferred-MedSwallow-70B, a healthcare-specific model trained on a unique corpus of Japanese medical data that has achieved top scores on the Japan National Examination for Physicians.

Chang Gung Memorial Hospital (CGMH), one of the leading hospitals in Taiwan, is building a custom AI Inference Service (AIIS) to centralize all LLM applications within the hospital system. Using Llama-3-Taiwan-70B, it is improving the efficiency of frontline medical staff with more nuanced medical language that patients can understand.

“By providing instant, context-appropriate guidance, AI applications built with local-language LLMs streamline workflows and serve as a continuous learning tool to support staff development and improve the quality of patient care,” said Dr. Changfu Kuo, director of the Center for Artificial Intelligence in Medicine at CGMH, Linko Branch. “NVIDIA NIM is simplifying the development of these applications, allowing for easy access and deployment of models trained on regional languages with minimal engineering expertise.”

Taiwan-based Pegatron, a maker of electronic devices, will adopt the Llama-3-Taiwan-70B NIM microservice for internal- and external-facing applications. The company has integrated the microservice with its PEGAAi Agentic AI System to automate processes, boosting efficiency in manufacturing and operations.

The Llama-3-Taiwan-70B NIM microservice is also being used by global petrochemical manufacturer Chang Chun Group, world-leading printed circuit board company Unimicron, technology-focused media company TechOrange, online contract service company LegalSign.ai and generative AI startup APMIC. These companies are also collaborating on the open model.

 

Creating Custom Enterprise Models With NVIDIA AI Foundry

While regional AI models can provide culturally nuanced and localized responses, enterprises still need to fine-tune them for their business processes and domain expertise.

NVIDIA AI Foundry is a platform and service that includes popular foundation models, NVIDIA NeMo for fine-tuning, and dedicated capacity on NVIDIA DGX Cloud to provide developers a full-stack solution for creating a customized foundation model packaged as a NIM microservice.

Additionally, developers using NVIDIA AI Foundry have access to the NVIDIA AI Enterprise software platform, which provides security, stability and support for production deployments.

NVIDIA AI Foundry gives developers the necessary tools to more quickly and easily build and deploy their own custom, regional language NIM microservices to power AI applications, ensuring culturally and linguistically appropriate results for their users.

 

Written by: Kari Briski, vice president, AI Software
