Intel Graphics Card: Safe to Disable?

Here’s what happens if you disable your Intel graphics card:

There are a lot of ways this can go, and they depend on how your computer is built and set up. 

If you have another graphics card, you won’t notice a difference. 

If you don’t have another card, disabling the Intel card kills your video. 

Ultimately, it doesn’t damage the computer, but it can make things complicated.

So if you want to learn all about what happens exactly when you disable your Intel graphics card, then this article is for you.

Keep reading!

Intel Graphics Card: Safe to Disable? (Everything to Know)

What Is the Intel Graphics Card?

Intel makes a lot of different bits of hardware, so this could mean a lot of things. 

In the vast majority of cases, an Intel graphics card is going to be something known as an integrated graphics card. 

Intel has made discrete graphics cards in the past (more than 20 years ago), and the company has started making them again.

But, if you have an Intel graphics card, it is overwhelmingly likely to be integrated with the CPU.

Let’s talk about what that really means.

An integrated graphics card is built into your central processing unit (CPU).

It’s designed to work in tandem with the CPU’s processing cores to make the computer function. 

More specifically, the graphics processing unit (GPU) handles the work necessary to produce video that you see on your screen. 

The CPU handles the rest of the calculations that make the computer work. (Technically speaking, the division of work between these components can blur, but this is a close enough description.)

When you use an integrated GPU, there is no discrete computer component that you can swap in or out for upgrades or replacements. 

Because it is “integrated,” replacing the graphics card would require you to replace the whole CPU. 

Even though the CPU and the integrated GPU serve different roles, they share the same physical chip, so you can’t remove one from the computer without removing the other.

It’s one physical component.

The thing to understand is that a computer can have a discrete graphics card instead (which, again, is very likely not made by Intel). 

In this case, there is a removable computer component that handles graphics processing. 

Discrete graphics cards are usually much more powerful than integrated cards, and they are preferred for high-end gaming as well as photo and video editing.

Because discrete cards can be added to a computer, it’s possible to have both an integrated card and a discrete card.

If you do, you could conceivably disable the integrated Intel card without impacting the discrete card and vice versa.

All of this is to say that if you disable your Intel graphics card, the results depend entirely on how your computer is set up and the specifics of your computer build. 

So, we’ll be keeping all of this in mind as we get deeper into the topic.

Why Would You Disable Your Intel Graphics Card? (2 Scenarios)

Before we can really go over what happens when an Intel graphics card is disabled, it helps to think about why you would even do this. 

Even if you have a discrete graphics card, you don’t really need to disable the Intel card. 

Each card works independently, so you can use the one you prefer without disabling or otherwise interacting with the unused card.

So, regardless of the situation, disabling the card is not something you normally need to do. 

That said, there are a few times when it might make sense to disable the card, and knowing more about that can help you understand the bigger picture.

#1 You Have Another Card

If you have a discrete card, you can disable the Intel card, and you shouldn’t notice any change (a major exception to this will be discussed later). 

As long as your displays are connected correctly, you can run video from the discrete card, and everything is fine.

But, why would you disable the Intel card?

For the most part, there are two reasons. The first is that you want to save power. 

Especially if you’re using a laptop, disabling the Intel card can potentially save battery life. 

Even if you have a desktop, this could lower your power consumption by a little. 

It can help if you have problems with overheating, and it’s nice if you’re trying to absolutely minimize your electric bill.

The other reason is that the Intel card is misbehaving. 

Problems with the Intel card can potentially impact general computer performance. 

So, if you’re able to isolate the problem to the Intel card, you can simply disable it, and your problem is solved.

#2 You Are Troubleshooting or Testing

Outside of the two reasons listed above, you might want to temporarily disable your Intel card for the sake of troubleshooting or testing.

For troubleshooting, this is a common practice. You can install a discrete card and disable the Intel card to see whether a specific problem was related to the Intel card. 

Generally, this is a step you might consider if you’re having video issues on the computer.

Disabling the Intel card also might help with testing. 

If you’re trying to run benchmark tests or other detailed tests for your specific computer setup, you can compare statistics when the Intel card is enabled vs disabled. 

It’s not necessary for typical use, but cutting out the Intel card’s small power draw might help you squeeze slightly better numbers out of certain tests.

How Do You Disable Your Intel Graphics Card?

So, assuming you have a good reason to try disabling the Intel graphics card, how is that done?

If you have a Windows machine, you can do it through Device Manager.

You want to get to the “Display Adapters” section. 

You can get there by typing “Device Manager” into the search bar in the Windows menu. 

If you don’t use the search feature, you can find it by navigating this way: Start > Control Panel > System > Device Manager > Display Adapters.

Once you have made it to the Display Adapters menu, you can right-click on the adapter that you see there (it should have Intel in its name). 

You’ll get a drop-down menu when you right-click, and you can choose to disable it. 

As a rule, don’t uninstall it. That’s a lot harder to undo.
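
If you’re comfortable with the command line, the same disable step can also be scripted. Below is a minimal sketch in Python that simply shells out to PowerShell’s built-in PnP cmdlets (Get-PnpDevice and Disable-PnpDevice). It assumes a Windows machine, an administrator prompt, and that the Intel adapter’s name actually contains “Intel,” so check the listing it prints before running the disable step.

```python
# Minimal sketch: disable the Intel display adapter from the command line.
# Assumes Windows, PowerShell's PnP cmdlets, and an elevated (administrator) prompt.
import subprocess


def run_powershell(command: str) -> str:
    """Run a PowerShell command and return whatever it prints."""
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout


# 1) List display adapters so you can confirm the Intel card's exact name first.
print(run_powershell(
    "Get-PnpDevice -Class Display | Format-List FriendlyName, InstanceId, Status"
))

# 2) Disable every display adapter whose name contains 'Intel'.
#    Do NOT run this if the Intel card is your only graphics card.
run_powershell(
    "Get-PnpDevice -Class Display | "
    "Where-Object { $_.FriendlyName -like '*Intel*' } | "
    "Disable-PnpDevice -Confirm:$false"
)
```

This does the same thing as the right-click in Device Manager, just without the menus, which is also why a similar script is useful later if you ever need to turn the card back on without a working display.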

If you’re not using Windows for this purpose, you can also disable the Intel graphics card in the BIOS. 

Getting into BIOS varies depending on how your computer was made, so it’s easiest to look that part up for your specific machine. 

Once in BIOS, you can look for display or graphics card options. 

In the appropriate settings, you should be able to find the Intel card and disable it.

Turning It Back On

If you want to enable a disabled graphics card, you can reverse the steps above. 

You can use the Windows Display Adapters menu, or you can go back through BIOS. 

Either way, the option to “enable” will replace the “disable” option that you used in the previous section.

After re-enabling a graphics card, it’s usually best to restart the computer so the changes can take full effect.

What Happens When Your Intel Graphics Card Is Disabled?

Ok. You have a better idea of how the Intel graphics card works. 

So, let’s really nail down the original question. 

What happens if you disable the Intel card? 

If you have a discrete graphics card, then you won’t really notice any difference. 

You were using the other card anyway, and that’s still going to be the case.

If you don’t have a discrete card, then disabling the Intel card is a terrible idea. 

It means you won’t get any video at all, and the computer largely becomes useless. 

Now, this doesn’t technically break the computer. 

You can re-enable the graphics card, but since you can’t see what you’re doing, it’s a challenge.

We’ll talk about how to overcome that problem in a bit.

There’s also a third scenario, and it’s specific to a special kind of setup. 

If your Intel card is working in tandem with a second card, then disabling the Intel card can hurt your computer’s performance. 

The clearest example of this is with Nvidia Optimus.

The Special Case of Nvidia Optimus

Some laptops have both an Intel graphics card and an Nvidia card, and the two might work together using Nvidia Optimus.

In this case, the two cards are intended to work together, and Optimus is the technology that directs them.

Basically, the Nvidia card handles some of the processing to reduce the load on the Intel card. 

But, calculations carried out by the Nvidia card are ultimately sent to the Intel card before they are delivered to your screen.

If you disable the Intel card with this setup, you just get a huge reduction in graphics processing. 

The display resolution and frame rates will likely go down considerably. You’ll end up with a blurry, choppy picture, no matter what you do.

Optimus setups are not designed for the Nvidia card to work all on its own. 

Removing the Intel backbone from your graphics processing creates these issues, and it can actually increase battery drain and hurt general computer performance. 

If your computer uses Optimus, it’s best to leave the Intel card running.

What If the Intel Graphics Card Is Your Only Card?

If the Intel card is your only card and you disable it, you won’t get any video to your screen or monitor(s). 

The graphics card is vital to video output.

Without it, you have a blank screen.

As you can imagine, this creates a lot of problems.

So, the simple solution is to turn the graphics card back on. 

The problem is that the standard methods to do that require a graphical user interface. 

If you can’t see what you’re doing, it gets very hard to navigate menus and turn your Intel card back on.

There are two ways to try to go about this (or you can offload the problem to an IT pro). 

The first is by doing blind keystrokes. 

You can look up instructions on another device (smartphones are convenient for this). 

Carefully following those instructions, you can use keystrokes to navigate the menus (even though you can’t see them) and re-enable your graphics card. 

Obviously, it’s easy to mess up along the way, so this is not a straightforward or ideal approach.

Another viable option is to write a script. 

You’ll need another device for this. 

On the other device, you can write a script that automatically re-enables the graphics card on the affected computer, without you needing to see the screen. 

You will likely need to load that script onto a bootable flash drive. 

Start up the computer with the flash drive plugged in, and it should be able to automate the process and restore your graphics card. 

There are too many things to list that can complicate this process, so do your homework.
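
As a rough illustration only, here is one way such a script could look. It’s a Python sketch that leans on PowerShell’s Enable-PnpDevice cmdlet; it assumes Windows, administrator rights, and that the disabled adapter’s name contains “Intel.” How you actually get it to run on a machine with no picture (bootable media, autorun, or remote access) is a separate problem, so treat this as a starting point rather than a ready-made fix.

```python
# Minimal sketch: re-enable a disabled Intel display adapter without using the GUI.
# Assumes Windows, PowerShell's PnP cmdlets, and an elevated (administrator) prompt.
import subprocess

# Re-enable any display adapter whose name contains 'Intel'.
enable_cmd = (
    "Get-PnpDevice -Class Display | "
    "Where-Object { $_.FriendlyName -like '*Intel*' } | "
    "Enable-PnpDevice -Confirm:$false"
)

subprocess.run(
    ["powershell", "-NoProfile", "-Command", enable_cmd],
    check=True,
)

print("Requested re-enable of the Intel display adapter; restart the computer to finish.")
```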

Why Does Your Laptop Have Two Graphics Cards?

Now you know everything about Intel’s graphics cards and whether you can disable them safely.

But how exactly did you end up with two graphics cards in the first place?

There are two likely reasons why a laptop would have more than one graphics card.

The first is that there is an integrated GPU and a dedicated graphics card because it was the cheapest way for a manufacturer to meet specifications.

The second is so the cards can work together for a dramatic performance upgrade.

Learn all about why your computer has two graphics cards here.

Author

  • Theresa McDonough

    Tech entrepreneur and founder of Tech Medic, who has become a prominent advocate for the Right to Repair movement. She has testified before the US Federal Trade Commission and been featured on CBS Sunday Morning, helping influence change within the tech industry.