
If your NVIDIA GPU isn’t detected properly on Ubuntu, or nvidia-smi shows an error like "couldn't communicate with the NVIDIA driver", this guide will walk you through how to fix that.

This was tested and confirmed working on a Dell Vostro 3521 with an NVIDIA GeForce MX350 GPU.


🧩 Step 1: Detect Your GPU

Open a terminal and run:

lspci -k | grep -EA3 'VGA|3D|Display'

You’ll see something like:

00:02.0 VGA compatible controller: Intel Corporation ...
	Kernel driver in use: i915
01:00.0 3D controller: NVIDIA Corporation GP107M [GeForce MX350]
	Kernel driver in use: nouveau

✅ This means your system has both Intel integrated graphics and an NVIDIA discrete GPU.
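
To double-check which kernel module is actually bound, you can also list the loaded modules directly:

lsmod | grep -E 'nouveau|nvidia'

If this shows nouveau entries but nothing for nvidia, the open-source driver is in use and the proprietary one isn't loaded yet.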


🧪 Step 2: Check OpenGL Renderer

Install Mesa utilities:

sudo apt install mesa-utils
glxinfo | grep "OpenGL renderer"

You’ll likely see:

OpenGL renderer string: Mesa Intel(R) Xe Graphics

This shows the system is currently using the Intel GPU for rendering — not NVIDIA.
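
As a small shortcut, glxinfo also accepts a -B flag that prints only the summary block, which is much easier to scan than the full output:

glxinfo -B

Look for the "OpenGL renderer string" line in the summary.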


🚧 Step 3: Check Available NVIDIA Drivers

Run:

sudo ubuntu-drivers devices

You’ll see a list like:

driver   : nvidia-driver-550 - distro non-free recommended
...
driver   : xserver-xorg-video-nouveau - distro free builtin

✅ Make note of the recommended driver (e.g. nvidia-driver-550).
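
If you'd rather not pick the version by hand, ubuntu-drivers can also install the recommended driver for you in one step (this should be equivalent to installing the recommended package manually, as we do in Step 5):

sudo ubuntu-drivers autoinstall

The manual install is shown below anyway so you know exactly which version you're getting.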


🧼 Step 4: Clean Up Old Drivers (If Any)

Let’s purge conflicting or broken drivers:

sudo apt purge 'nvidia-*'
sudo apt remove --purge xserver-xorg-video-nouveau
sudo apt autoremove

Then update:

sudo apt update
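
As a quick sanity check that the cleanup worked, list any NVIDIA packages still installed:

dpkg -l | grep -i nvidia

Ideally this prints nothing, or only entries flagged rc (removed, with leftover config files).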

💾 Step 5: Install the Correct NVIDIA Driver

Install the recommended version:

sudo apt install nvidia-driver-550 nvidia-prime
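
Depending on your kernel, Ubuntu installs either a prebuilt signed NVIDIA module or builds one through DKMS. If you want to check before rebooting whether a DKMS build succeeded (it may legitimately print nothing if a prebuilt module was used):

dkms status

A line like nvidia/550.120, <your kernel>, x86_64: installed means the module built cleanly.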

🔐 Step 6: Disable Secure Boot (IMPORTANT!)

If Secure Boot is enabled in BIOS, the NVIDIA kernel module may be rejected silently, even if installation succeeded.

Disable Secure Boot on Dell Vostro 3521:

  1. Reboot your system.
  2. Press F2 repeatedly to enter BIOS.
  3. Go to Boot or Security tab.
  4. Set Secure Boot to Disabled.
  5. Save and exit.
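
You can confirm the Secure Boot state from inside Ubuntu before and after the BIOS change. This assumes the mokutil tool is present (install it with sudo apt install mokutil if it isn't):

mokutil --sb-state

It should print "SecureBoot disabled" once the BIOS setting has taken effect.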

🔁 Step 7: Reboot and Verify

Now reboot:

sudo reboot

After boot, check if the driver is working:

nvidia-smi

You should see output like:

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 550.120        Driver Version: 550.120        CUDA Version: 12.4 |
|------------------------+------------------+-----------------+---------------|
| GPU  Name              | Bus-Id           | Memory-Usage    | GPU-Util      |
|------------------------+------------------+-----------------+---------------|
|   0  GeForce MX350     | 01:00.0          | 5MiB / 2048MiB  | 0%            |
+------------------------+------------------+-----------------+---------------+

✅ Success! Your NVIDIA GPU is now live.
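
For extra confirmation beyond nvidia-smi, you can check that the kernel module is loaded and ask it for its version directly:

lsmod | grep nvidia
cat /proc/driver/nvidia/version

The second command reads the version string straight from the loaded kernel module.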


⚙️ Step 8: Use NVIDIA on Demand (prime-run)

If prime-run doesn't exist, create it:

sudo nano /usr/bin/prime-run

Paste:

#!/bin/bash
# Run the given command with its rendering offloaded to the NVIDIA GPU
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia __GL_VRR_ALLOWED=0 exec "$@"

Save and make it executable:

sudo chmod +x /usr/bin/prime-run

Now test:

prime-run glxinfo | grep "OpenGL renderer"

You should see:

OpenGL renderer string: NVIDIA GeForce MX350 ...
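
The wrapper is only a convenience. For a one-off test you can set the same environment variables inline, without creating the script at all:

__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"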

🚀 Bonus: Run Apps Using NVIDIA GPU

Launch apps like this:

prime-run firefox
prime-run blender
prime-run vlc

Apps launched through prime-run use the NVIDIA GPU, while everything else keeps running on the Intel GPU, so the discrete GPU only draws power when you actually need it. For desktop launchers, see the sketch below.
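
For apps you start from the desktop instead of a terminal, you can make the launcher itself use prime-run. A minimal sketch using Firefox as an example (the .desktop path is an assumption and may differ on your system):

cp /usr/share/applications/firefox.desktop ~/.local/share/applications/
sed -i 's|^Exec=|Exec=prime-run |' ~/.local/share/applications/firefox.desktop

The copy in ~/.local/share/applications overrides the system-wide entry for your user only.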


🧠 Common Issues

Problem              Solution
nvidia-smi fails     Make sure Secure Boot is disabled.
Module won’t load    Run sudo modprobe nvidia and check dmesg.
Poor performance     Use sudo prime-select nvidia to switch fully.
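
To make the module-won't-load case concrete, this is the usual diagnosis sequence (the grep pattern is just a convenience filter; a Secure Boot rejection typically shows up as a module signature or key error):

sudo modprobe nvidia
sudo dmesg | grep -iE 'nvidia|nvrm'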

✅ Final Notes

  • Use prime-select query to check current mode (intel, nvidia, or on-demand).
  • Switch modes:
  sudo prime-select nvidia  # Always use NVIDIA
  sudo prime-select intel   # Use Intel only
  sudo prime-select on-demand  # Default hybrid mode
  • Reboot after switching modes.

✨ Conclusion

Getting your NVIDIA GPU working on Ubuntu, especially on hybrid laptops like the Dell Vostro 3521, can be tricky, but it really comes down to two things: disabling Secure Boot and installing the correct driver. Once those are in place, everything else falls into line.
