Why Is My HDMI Port Not Detecting a Display?

Usually a connection, driver, or display settings issue — here’s how to find it and fix it


Plugging an HDMI cable into your computer and getting no signal on the connected monitor or TV — a black screen, a “no signal” message, or the display not appearing in your display settings — is a common problem with several distinct causes.

The fix depends on whether the issue is with the cable, the port, the drivers, or the display settings.

Here’s how to work through it systematically.


Check the Physical Connection First

Before any software troubleshooting, verify the physical connection is solid. HDMI cables that aren’t fully seated produce no signal even when everything else is working correctly.

Unplug the HDMI cable from both ends, at the computer and at the monitor or TV. Wait 10 seconds, then plug it back in firmly at both ends. HDMI connectors are friction-fit rather than latching, so push each plug in until it stops and sits snugly; a connector resting halfway in will often show no signal.

Also check the specific port you’re using. If your computer has multiple HDMI ports, try a different one. If it has both HDMI and DisplayPort, try a different cable and port type altogether to rule out the specific port being the issue.

Check the monitor or TV input selection. Most displays have multiple input sources — HDMI 1, HDMI 2, DisplayPort — and need to be manually switched to the correct input. Use the display’s physical buttons or remote to select the input that matches where the cable is connected.


Try a Different HDMI Cable

HDMI cables fail more often than people expect — particularly older cables, cables that have been bent sharply, or cables that have been connected and disconnected many times. A cable that works for audio and basic video may fail for higher resolutions or refresh rates.

Swap the cable for a known-working one and test. If the display appears with the replacement cable, the original cable is faulty. Replace it with a cable rated for your resolution: a Premium High Speed (HDMI 2.0) cable for 4K at 60Hz, or an Ultra High Speed (HDMI 2.1) cable for 4K at 120Hz or 8K.
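
To see why the rating matters, here is a rough back-of-envelope estimate of the bandwidth each mode needs. It is only a sketch: it assumes 8-bit RGB color and 8b/10b TMDS encoding and ignores blanking intervals, so the exact numbers are approximate, but the comparison against the 18 Gbps HDMI 2.0 and 48 Gbps HDMI 2.1 limits holds.

    # Rough HDMI bandwidth estimate: pixels per second times bits per pixel,
    # plus TMDS 8b/10b encoding overhead. Blanking intervals are ignored.
    def required_gbps(width, height, refresh_hz, bits_per_pixel=24, overhead=1.25):
        raw_bits_per_second = width * height * refresh_hz * bits_per_pixel
        return raw_bits_per_second * overhead / 1e9

    for name, w, h, hz in [("1080p60", 1920, 1080, 60),
                           ("4K60",    3840, 2160, 60),
                           ("4K120",   3840, 2160, 120)]:
        print(f"{name}: ~{required_gbps(w, h, hz):.1f} Gbps "
              f"(HDMI 2.0 tops out at 18 Gbps, HDMI 2.1 at 48 Gbps)")

A 4K at 120Hz signal needs roughly 30 Gbps, which simply does not fit through an HDMI 2.0 connection. That is why an older cable can work perfectly at 1080p and produce nothing at higher modes.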


Restart Both Devices

A full restart of both the computer and the display clears temporary states that cause detection failures. The HDMI handshake between the two devices can get out of sync, and a restart forces them to renegotiate it from scratch.

Turn off the monitor or TV completely using its power button — not just standby. Turn off the computer. Wait 30 seconds. Turn the display on first, then turn on the computer. This order matters — some systems detect displays during boot and need the display to be ready before the computer starts.


Force Windows to Detect the Display

Windows sometimes fails to automatically detect a newly connected display even when the hardware connection is fine. You can manually trigger detection.

Right-click the desktop and select Display Settings. Scroll down and click Detect under the Multiple Displays section. Windows scans for connected displays and should pick up the HDMI-connected one.

Also press Windows + P to open the projection menu. Try switching between PC Screen Only, Duplicate, Extend, and Second Screen Only. Sometimes toggling through these modes triggers the display to be recognized.

On laptops, the function key combination for display output — often Fn + F4, Fn + F5, or Fn + F8 depending on the model — can also trigger display detection.
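
If you are comfortable running a small script, you can also ask Windows directly which display adapters and monitors it currently sees. Below is a minimal sketch (Windows only, assuming Python is installed) that calls the Win32 EnumDisplayDevices API through ctypes; if the HDMI display never shows up in the output, Windows is not detecting it at the driver level and the problem is upstream of any display-settings toggle.

    # List the display adapters Windows knows about and the monitors attached to each.
    # Uses the Win32 EnumDisplayDevicesW API via ctypes; run on Windows.
    import ctypes
    from ctypes import wintypes

    class DISPLAY_DEVICEW(ctypes.Structure):
        _fields_ = [("cb", wintypes.DWORD),
                    ("DeviceName", wintypes.WCHAR * 32),
                    ("DeviceString", wintypes.WCHAR * 128),
                    ("StateFlags", wintypes.DWORD),
                    ("DeviceID", wintypes.WCHAR * 128),
                    ("DeviceKey", wintypes.WCHAR * 128)]

    DISPLAY_DEVICE_ACTIVE = 0x1  # monitor is attached to the desktop

    user32 = ctypes.windll.user32

    def enum_devices(parent=None):
        index = 0
        while True:
            device = DISPLAY_DEVICEW()
            device.cb = ctypes.sizeof(device)
            if not user32.EnumDisplayDevicesW(parent, index, ctypes.byref(device), 0):
                break
            yield device
            index += 1

    for adapter in enum_devices():
        print(f"Adapter: {adapter.DeviceString} ({adapter.DeviceName})")
        for monitor in enum_devices(adapter.DeviceName):
            state = "active" if monitor.StateFlags & DISPLAY_DEVICE_ACTIVE else "inactive"
            print(f"  Monitor: {monitor.DeviceString} [{state}]")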


Update or Reinstall Display Drivers

Outdated or corrupted GPU drivers are one of the most common causes of HDMI detection failures. The driver manages how the GPU communicates with connected displays — when it’s outdated or corrupted, the GPU may fail to initialize the HDMI port correctly.

Open Device Manager by right-clicking the Start button. Expand Display Adapters and right-click your GPU. Select Update Driver → Search Automatically for Drivers.

For a more reliable update, download the driver directly from your GPU manufacturer:

  • NVIDIA: nvidia.com/drivers
  • AMD: amd.com/support
  • Intel: intel.com/content/www/us/en/download-center
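
Before downloading, it can help to confirm which GPU and driver version Windows currently reports. One way to do that, assuming Python is installed, is to call PowerShell's Get-CimInstance for the Win32_VideoController class:

    # Print each GPU Windows reports, with its driver version and date (Windows only).
    import subprocess

    query = ("Get-CimInstance Win32_VideoController | "
             "Select-Object Name, DriverVersion, DriverDate | Format-List")
    result = subprocess.run(["powershell", "-NoProfile", "-Command", query],
                            capture_output=True, text=True)
    print(result.stdout)

If your dedicated GPU is missing from this list entirely, the problem sits below the driver: Windows is not seeing the card at all, which points at the hardware checks later in this guide.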

For NVIDIA and AMD dedicated GPUs, use DDU (Display Driver Uninstaller, available from guru3d.com) for a clean installation. Boot into safe mode, run DDU to completely remove the existing driver, restart normally, and install the fresh driver. A clean install eliminates residual corrupted files that a standard update can leave in place.

Restart after installing and test HDMI detection.


Check Whether You’re Using the Right HDMI Port on a Desktop

Desktops with a dedicated GPU have two sets of display outputs — ports on the GPU itself and ports on the motherboard. The motherboard ports only work when using integrated graphics. If you have a dedicated GPU installed, you must connect your monitor to the GPU’s ports — the ones on the card itself, accessible at the back of the computer below the motherboard ports.

Plugging into the motherboard’s HDMI port when a dedicated GPU is installed produces no signal in most configurations because the motherboard’s integrated graphics are disabled by the GPU.

Check which ports you’re using. The GPU’s ports are on a separate bracket below the motherboard I/O panel — look for the display ports on the card that’s installed in the PCIe slot.


Check the HDMI Port for Physical Damage

HDMI ports can become damaged — bent pins inside the port, loose connector mounts, or corrosion on the contacts. Inspect the HDMI port on your computer and on the display with a flashlight. Look for:

  • Bent or missing pins inside the port
  • Debris or dust blocking the connection
  • A port that feels loose or wobbly rather than solid
  • Visible corrosion or oxidation on the metal contacts

If the port on the computer is damaged, the fix depends on the device — a desktop’s discrete GPU with a damaged HDMI port can be replaced. A laptop with a damaged HDMI port may require motherboard repair or using a USB-C to HDMI adapter as a workaround.


Check GPU and Integrated Graphics Settings in BIOS

On desktops with both integrated and dedicated graphics, BIOS settings control which is active. Some BIOS configurations disable the iGPU entirely when a dedicated GPU is present — or vice versa — causing specific ports to produce no signal.

Restart and enter BIOS by pressing F2, Delete, F10, or Escape during startup. Look for graphics-related settings; the names vary by manufacturer but include Primary Display, Initial Display Output, and IGD Multi-Monitor. If you want to use both the integrated and dedicated graphics ports simultaneously, enable IGD Multi-Monitor or its equivalent. If you only have a dedicated GPU, ensure it’s set as the primary display.


Check HDCP Compatibility

HDCP (High-bandwidth Digital Content Protection) is a copy protection standard that some content sources enforce. If either the source (the computer) or the display doesn’t support the required HDCP version, the content won’t display; you’ll typically see a black screen or an HDCP error message rather than a “no signal” indication.

This is more commonly the cause of a black screen on a connected display rather than the display not being detected at all. Streaming apps, Blu-ray players, and some game consoles enforce HDCP. If the display is detected but shows black when playing specific content, HDCP incompatibility is likely.

Check that your display supports HDCP 2.2 for 4K protected content. This is listed in the display’s specifications.


Test With a Different Display

To isolate whether the issue is with the computer or the display, connect the HDMI cable to a completely different monitor or TV. If the second display works, the original display has an issue — either its HDMI port, its input settings, or an internal fault.

If neither display works with the same computer, the issue is on the computer side — the HDMI port, the GPU, or the drivers.


Check Resolution and Refresh Rate Compatibility

If the display is detected briefly but then goes black, or if it shows a “signal out of range” message, the resolution or refresh rate set on the computer exceeds what the display supports.

Boot into safe mode: hold Shift while clicking Restart, then choose Troubleshoot → Advanced Options → Startup Settings → Restart and press 4 when the options appear. (The old F8-at-startup shortcut only works on older Windows versions or when it has been re-enabled.) In safe mode, Windows uses a basic display driver at a low resolution that any display supports. If the display works in safe mode, the resolution or refresh rate set in normal mode is incompatible with the display.

In safe mode, go to Display Settings → Advanced Display Settings and lower the resolution or refresh rate. Restart normally and test.
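
If you can see the desktop some other way, for instance on a second display or in safe mode, you can also check what resolution and refresh rate Windows believes it is driving. A small sketch (Windows only, assuming Python is installed) that reads the Win32_VideoController class:

    # Print the resolution and refresh rate each video controller is currently set to.
    import subprocess

    query = ("Get-CimInstance Win32_VideoController | Select-Object Name, "
             "CurrentHorizontalResolution, CurrentVerticalResolution, "
             "CurrentRefreshRate | Format-List")
    print(subprocess.run(["powershell", "-NoProfile", "-Command", query],
                         capture_output=True, text=True).stdout)

Compare the reported mode against the display’s specifications; if it is above what the display accepts, lower it before reconnecting.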


Check for a Loose or Failing GPU

On desktop computers, a GPU that isn’t fully seated in the PCIe slot can cause intermittent or complete HDMI detection failures. The card may appear installed but a loose connection at the slot causes unreliable signal.

Power down the computer and unplug it. Open the case and locate the GPU. Press down firmly on the card to ensure it’s fully seated in the PCIe slot — you should feel it click into place. Check that the PCIe power connectors are fully plugged into the card. Tighten the retaining screw at the bracket.

Power the computer back on and test HDMI detection.


Try a USB-C to HDMI Adapter

If your computer has USB-C or Thunderbolt ports, a USB-C to HDMI adapter or cable bypasses the built-in HDMI port entirely. If the adapter works and the built-in port doesn’t, the built-in port is faulty.

This also serves as a workaround if the built-in HDMI port is physically damaged. The USB-C port does need to support video output (DisplayPort Alt Mode or Thunderbolt) for the adapter to work; charge-only or data-only USB-C ports won’t carry a display signal.


Check Windows Event Viewer

Windows Event Viewer records display-related errors that can identify the specific cause of HDMI detection failures.

Press Windows + R, type eventvwr.msc, and press Enter. Navigate to Windows Logs → System. Filter or search for errors with sources like display or igfx that appeared when you connected the HDMI cable. The error details often point directly at whether the issue is driver-related, hardware-related, or a configuration problem.
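
The same log can be queried from a script using the built-in wevtutil tool. Below is a minimal sketch (Windows only, assuming Python is installed); the provider names are examples and vary by GPU vendor, so adjust them to match your hardware.

    # Dump recent System-log events from common display-related providers (Windows only).
    # Provider names differ by vendor; edit the list to match your hardware.
    import subprocess

    providers = ["Display", "igfx", "nvlddmkm", "amdkmdag"]
    xpath = "*[System[Provider[" + " or ".join(
        f"@Name='{name}'" for name in providers) + "]]]"

    subprocess.run(["wevtutil", "qe", "System",
                    f"/q:{xpath}",
                    "/c:20",      # last 20 matching events
                    "/rd:true",   # newest first
                    "/f:text"])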


macOS-Specific Checks

On Mac, HDMI detection issues have some platform-specific causes and fixes.

Go to Apple Menu → System Settings → Displays, then hold the Option key to reveal the Detect Displays button and click it. This forces macOS to rescan for connected displays.

For M-series Macs, the number of external displays supported depends on the chip: the M1 MacBook Air supports one external display, while M1 Pro and higher support multiple. Connecting more displays than the chip supports causes the extras to not be detected.

Also check whether the Mac needs to be restarted after connecting the display — some Mac configurations require a restart to initialize a newly connected display correctly.
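
You can also ask macOS what it currently sees on the graphics side from the terminal. Below is a minimal sketch (macOS only, assuming Python 3 is installed) that wraps the built-in system_profiler tool; the exact key names in the JSON output can vary between macOS versions, so treat them as assumptions.

    # List the GPUs and connected displays macOS currently reports (macOS only).
    import json
    import subprocess

    raw = subprocess.run(["system_profiler", "SPDisplaysDataType", "-json"],
                         capture_output=True, text=True).stdout

    for gpu in json.loads(raw).get("SPDisplaysDataType", []):
        print("GPU:", gpu.get("sppci_model", "unknown"))
        for display in gpu.get("spdisplays_ndrvs", []):
            print("  Display:", display.get("_name", "unknown"),
                  display.get("_spdisplays_resolution", ""))

If the HDMI display never appears in this output, macOS is not detecting it at all and the issue is upstream of any Displays setting.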


A Quick Checklist

Work through these in order:

  • Check physical connections — reseat cable at both ends
  • Switch display input source on the monitor or TV
  • Try a different HDMI cable
  • Restart both devices — display first, then computer
  • Press Windows + P and cycle through display modes
  • Right-click desktop → Display Settings → Detect
  • Check which HDMI port you’re using on desktops: the GPU’s ports, not the motherboard’s
  • Update or clean-reinstall GPU drivers
  • Inspect HDMI port for physical damage
  • Check BIOS graphics settings for iGPU and dedicated GPU configuration
  • Test with a different display to isolate computer vs display issue
  • Boot safe mode to test at lower resolution if display goes black
  • Reseat GPU in PCIe slot on desktops
  • Try USB-C to HDMI adapter as a workaround or diagnostic

The Bottom Line

HDMI not detecting a display is almost always caused by a faulty cable, using the wrong HDMI port on a desktop with a dedicated GPU, outdated drivers, or Windows not automatically detecting the display. Trying a different cable, ensuring you’re using the GPU’s HDMI port rather than the motherboard’s, and manually triggering detection through Display Settings together resolve the majority of cases.

Driver issues are the most common software cause — a clean GPU driver reinstall using DDU resolves detection failures that standard driver updates sometimes miss.

An HDMI display goes undetected when the connection, the driver, or the settings aren’t right; fix whichever of the three is off and the display appears.
