NVIDIA has launched an artificial intelligence (AI) chatbot called Chat with RTX that runs locally on your PC and doesn't require an internet connection. The GPU maker has been at the forefront of the AI industry since the generative AI boom, supplying the advanced chips that power AI products and services, and it also offers an AI platform that provides end-to-end solutions for enterprises. Now the company is building its own chatbot, with Chat with RTX as its first product. It is currently available as a free demo app.

Nvidia calls it a personalized AI chatbot and launched the tool on Tuesday (February 13). Users wishing to download the software will need a Windows PC or workstation equipped with an RTX 30 or 40 series GPU and at least 8GB of VRAM. Once downloaded, the app can be installed in just a few clicks and used right away.

Because Chat with RTX is a local chatbot, it has no knowledge of the outside world. However, users can feed it their own data, such as documents and files, and run queries against it. One such use case is to give it a large volume of work-related documents and then ask it to summarize, analyze, or answer specific questions that could otherwise take hours to find manually. Likewise, it can be an effective research tool for skimming through multiple studies and papers. The chatbot supports text, PDF, doc/docx, and XML file formats. It also accepts YouTube video and playlist URLs and can use a video's transcript to answer queries or summarize the video; internet access is required for this feature.
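The basic workflow the app enables, pointing it at a folder of files and then asking questions grounded in them, can be pictured with a short sketch. The snippet below is not NVIDIA's code; it is a minimal, illustrative example of loading local .txt files and ranking them against a question with a simple word-overlap score, and the folder name and function names are made up for the example.

```python
from pathlib import Path

def load_documents(folder: str) -> dict[str, str]:
    """Read every .txt file in a folder into memory (illustrative only)."""
    return {p.name: p.read_text(encoding="utf-8", errors="ignore")
            for p in Path(folder).glob("*.txt")}

def rank_by_overlap(question: str, docs: dict[str, str], top_k: int = 3) -> list[str]:
    """Score each document by how many of the question's words it contains."""
    terms = set(question.lower().split())
    scores = {name: sum(word in text.lower() for word in terms)
              for name, text in docs.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

docs = load_documents("./work_reports")  # hypothetical folder of work documents
print(rank_by_overlap("Q3 revenue summary", docs))
```

A real tool would use embeddings and chunking rather than raw word overlap, but the shape of the task is the same: everything happens on the local machine, against files the user supplies.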

According to the demo video, Chat with RTX is essentially a local web server with a Python instance, and it does not ship with a large language model (LLM) when newly downloaded. Users can choose to download either the Mistral or Llama 2 model and then run queries using their own data. The company said the chatbot leverages open-source projects such as retrieval-augmented generation (RAG), TensorRT-LLM, and RTX acceleration for its functionality.
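The RAG setup described here boils down to retrieving the most relevant local text and prepending it to the user's question before the prompt reaches the local model. The sketch below is a generic illustration of that pattern, not NVIDIA's internals: generate() is a placeholder standing in for whatever locally hosted model (for example, a TensorRT-LLM engine) actually produces the answer.

```python
def generate(prompt: str) -> str:
    """Placeholder for the locally hosted LLM; returns a dummy string here."""
    return f"[model output for a {len(prompt)}-character prompt]"

def build_rag_prompt(question: str, retrieved_chunks: list[str]) -> str:
    """Ground the model in retrieved local text before asking the question."""
    context = "\n\n".join(retrieved_chunks)
    return ("Answer using only the context below.\n\n"
            f"Context:\n{context}\n\n"
            f"Question: {question}\nAnswer:")

# Chunks would come from a retrieval step over the user's indexed files.
chunks = ["Excerpt from a local document retrieved for this question..."]
print(generate(build_rag_prompt("What does the report conclude?", chunks)))
```

Because the retrieval, the prompt assembly, and the model all run on the user's GPU, no query data has to leave the machine, which is the point of the local-only design.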

According to a report from The Verge, the app is around 40GB in size, and the Python instance can take up around 3GB of RAM. One particular issue pointed out by the publication is that the chatbot generates JSON files inside every folder you ask it to index, so pointing it at an entire document folder or a large parent folder can be a hassle.
