JetBrains Rider 2024.3 Help

Chat with AI

Use the AI Assistant tool window to have a conversation with the LLM (Large Language Model), ask questions about your project, or iterate on a task.

AI Assistant takes into account the language and technologies used in your solution, as well as your local changes and version control system commits. It can also search for files, classes, and element usages.

Start a new chat

  1. Click AI Assistant on the right toolbar to open AI Assistant.

  2. In the input field, type your question.

    If you have a piece of code selected in the editor, use the /explain and /refactor commands to save time when typing your query.

    Use the /docs command to ask JetBrains Rider-related questions. If applicable, AI Assistant will provide a link to the corresponding setting or documentation page.

  3. If you want to attach a particular file or function to your query to provide more context, use #:

    • #thisFile refers to the currently open file.

    • #localChanges refers to the uncommitted changes.

    • #file: opens a popup with a list of files from the current project. You can select the file you need from the popup or type its name (for example, #file:Foo.md).

    • #symbol: adds a symbol to the prompt (for example, #symbol:FieldName).

    • #schema: refers to a database schema. You can attach a database schema to enhance the quality of generated SQL queries with your schema's context.
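    Commands and context references can be combined in a single query. For example (Foo.md comes from the example above; the rest of the wording is illustrative):

    ```
    #file:Foo.md Summarize the contents of this file.
    #localChanges Write a commit message for these uncommitted changes.
    #symbol:FieldName Where is this symbol used, and can it be renamed safely?
    ```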

  4. In the input field, select your preferred AI chat model from the list of currently available models.

    If you want to connect the AI Assistant chat to your local model, refer to Connect AI Assistant chat to your local LLM.

  5. Press Enter to submit your query.

    JetBrains Rider: Asking AI Assistant programming-related questions
  6. Click Regenerate this response at the beginning of the AI Assistant's answer to get a new response to your question.

AI Assistant keeps a separate chat history for each project across IDE sessions. You can find the saved chats in the All Chats list.

Chat names are generated automatically and contain a summary of the initial query. Right-click a chat's name to rename it or delete it from the list.

Manage the smart chat mode

To give more precise answers, AI Assistant has the smart chat mode enabled by default.

In this mode, AI Assistant might send additional details, such as file types, frameworks used, and any other information that may be necessary for providing context to the LLM.

  • To disable the smart chat mode, clear the Enable smart chat mode checkbox in Settings | Tools | AI Assistant.

    Enable smart mode option in the settings

Connect AI Assistant chat to your local LLM

If you do not want to use cloud-based models while working with the AI Assistant chat, you can connect a local LLM available through Ollama.

  1. Press Ctrl+Alt+S to open settings and then select Tools | AI Assistant.

  2. In the Third-party AI providers section, select the Enable Ollama checkbox, specify your local host URL, and click Test Connection.

  3. When working with the AI Assistant chat, select your model from the list of available LLMs.
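Before entering the URL in the settings, you may want to check that your local Ollama server is reachable. A minimal sketch in Python, assuming Ollama's default address http://localhost:11434 and its /api/tags model-listing endpoint:

```python
import json
import urllib.request

# Default Ollama host URL; adjust it if your local server
# listens on a different host or port.
OLLAMA_HOST = "http://localhost:11434"

def tags_endpoint(host: str) -> str:
    # Ollama exposes the list of locally installed models at /api/tags.
    return host.rstrip("/") + "/api/tags"

def list_local_models(host: str = OLLAMA_HOST) -> list[str]:
    """Return the names of models a local Ollama server offers,
    or an empty list if the server is not reachable."""
    try:
        with urllib.request.urlopen(tags_endpoint(host), timeout=3) as resp:
            data = json.load(resp)
        return [model["name"] for model in data.get("models", [])]
    except OSError:
        return []

if __name__ == "__main__":
    models = list_local_models()
    if models:
        print("Ollama is reachable; models:", ", ".join(models))
    else:
        print("No Ollama server reachable at", OLLAMA_HOST)
```

If the script prints an empty result, start the server (for example, with the ollama serve command) before testing the connection in the IDE settings.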

Use AI Assistant to retrieve context-based answers

  1. Click AI Assistant on the right toolbar to open AI Assistant.

  2. Use natural language to request information based on the context of your workspace. Here are some examples:

    • Request recent files: retrieves a list of files you have recently viewed.

    • Ask for the current file: displays the full content of the currently opened file.

    • Request visible code: retrieves the code currently visible in your editor.

    • Ask for local changes: displays uncommitted changes in your file tree.

    • Find information in README: searches for relevant information within README files.

    • Check recently changed files: lists files modified in the ten latest commits.

Create file from snippet

You can create a new file with the AI-generated code right from the AI Assistant chat.

  • In the upper-right corner of the field with the generated code, click Create File from Snippet.

    Create File from Snippet action

    AI Assistant will create a file with the AI-generated code.

    If you have a file opened or selected in the Solution Explorer (Alt+1), the new file will be created in the same project as that file.

Attach database schema

You can enhance the quality of generated SQL queries with the context of the database schema that you are working with. To do that, attach the schema in the AI Assistant tool window. AI Assistant will get access to the structure of the attached schema and provide the LLM with information about it.

To use this feature, you need to grant AI Assistant consent to access the database schema.

Attaching a schema also improves the results of the actions in the AI Actions context menu group, for example, Explain Code and Suggest Refactoring. For more information about those actions, refer to Use AI prompts to explain and refactor your code.

  1. In the AI Assistant tool window input field, enter your prompt with # followed by the schema name. For example: Give me a query to get all actor names from #public.

    Attaching database schema to AI chat by mentioning it in the prompt
  2. Press Enter.

AI Assistant will analyze your schema and generate the result.

AI generated code for the entered prompt that mentioned a schema
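The generated statement depends on your schema, but for the example prompt above it might have roughly this shape (assuming the attached public schema contains an actor table with first_name and last_name columns; these names are hypothetical):

```sql
SELECT first_name, last_name
FROM public.actor;
```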

You can see which schema was attached to your message and navigate to that schema in the Database tool window. To do that, click the Attached elements icon in your message, then click the schema name.

If you want to allow AI Assistant to always attach the selected schemas, select the Always allow attaching database schemas checkbox in the Attach Schema dialog. Alternatively, enable the Allow attaching database schemas setting in Settings | Tools | AI Assistant.

Last modified: 20 November 2024