1 Rookie • 3 Posts • April 2nd, 2025 17:12

R630 Not Preserving NVMe BIOS Settings Over Cold Reboot

I have a PowerEdge R630 running VMware ESXi 8. The BIOS is v2.19.0, installed in December 2024. The machine has 2 x 1 TB SATA disks in a RAID 1 configuration connected to a PERC H730 Mini controller.

In January 2025 I added 2 x M.2 1 TB Western Digital Black NVMe drives to the server. The NVMe drives work and perform perfectly as VM storage. Since my R630 is a home lab server, I frequently turn it off for several days at a time.

Here is my problem: when I reboot after the server has been off for a couple of days, the R630 does not recognize the newly installed NVMe drives. I must boot into the BIOS and go to Integrated Devices, where the slot bifurcation reads 4x4x4. I must reset the bifurcation back to “default”, save the settings, and reboot. I then must enter the BIOS a second time, go to Integrated Devices, set the slot bifurcation back to 4x4x4, save, and reboot. The server and ESXi then fully recognize the NVMe drives and everything works without a hitch.

I have replaced the CR2032 battery on the motherboard with a new one. The system reports that the batteries for the motherboard and the PERC H730 Mini are both good and in an “optimal” state.

Any idea why this might be happening, and how to make the BIOS settings stick when the server is powered off for a couple of days?
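
One way to see whether the stored setting itself survives the power-off is to read it back through the iDRAC instead of through Setup. A minimal sketch using remote RACADM, assuming it is enabled; the IP and credentials are placeholders, and since the exact bifurcation attribute name varies by platform, list the whole group:

  # List the Integrated Devices BIOS group; the slot bifurcation attribute
  # and its currently stored value should appear in the output.
  racadm -r <idrac-ip> -u root -p <password> get BIOS.IntegratedDevices

Capturing this output right after a working boot and again after a failing cold boot would show whether the stored value itself changes, or whether the value sticks and only the device detection fails.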

Moderator • 4.6K Posts • April 11th, 2025 20:45

Hello,

Are the M.2 NVMe cards the only things that don't work after the server has been shut down?

We don't have any M.2 devices listed in the R630 parts database, so I don't have anything to recommend.

R630 spec sheet:
https://i.dell.com/sites/doccontent/shared-content/data-sheets/en/documents/dell-poweredge-r630-spec-sheet.pdf

As a 3rd-party discussion, you may check this:
https://www.reddit.com/r/homelab/comments/gfi4sk/adding_m2_to_a_dell_r630/?rdt=36973

3 Apprentice • 1.1K Posts • April 2nd, 2025 19:25

That's certainly an interesting one! Is the iDRAC updated as well? A few suggestions:

1. Try moving the NVRAM_CLR jumper to the other two pins, power on the server, let it complete the boot, then gracefully shut it down and move the jumper back.
2. If they aren't already, put the NVMe cards in slots 1 and 2.
3. A question: are you sure that two reboots alone, without changing the BIOS settings, wouldn't resolve the issue?
4. This last one is harder to do. In the iDRAC, export the server configuration profile in the working state and again in the failing state, and compare the two files. I use Beyond Compare for things like this; it lets you compare two files side by side and shows you the differences. That would tell you whether the settings are actually changing behind the scenes. It doesn't solve the issue, but it could be a data point; see the sketch below.
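
A sketch of that export with remote racadm, assuming remote RACADM is enabled (the IP, credentials, and file names are placeholders):

  # Export the Server Configuration Profile while the drives are working
  racadm -r <idrac-ip> -u root -p <password> get -t xml -f scp_working.xml

  # Export again after a failing cold boot
  racadm -r <idrac-ip> -u root -p <password> get -t xml -f scp_failing.xml

  # Any BIOS drift should show up under the BIOS.Setup component
  diff -u scp_working.xml scp_failing.xml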

Rey
#Iwork4Dell

Moderator • 5.2K Posts • April 3rd, 2025 03:30

Hello,

"I added 2 x M.2 1 TB Western Digital Black NVMe drives to the server"

"I must boot into the BIOS and go to Integrated Devices, where the slot bifurcation reads 4x4x4. I must reset the bifurcation back to “default”"

During POST the system does device detection and PCIe link training to figure out what speed and settings to use for each slot. We looked up M.2 in our parts database and nothing comes up for this part, which makes me think you are using a 3rd-party PCIe card to host the two M.2 drives. In that case, I'm afraid there's nothing we can do about it.
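
That said, if you want to confirm whether the drives are being enumerated at all after a cold boot, the hardware inventory is one place to look. A sketch with remote racadm (placeholder IP and credentials; the search term is a guess based on the "PCIeSSD" naming in the Lifecycle log):

  # Dump the hardware inventory and look for the NVMe drives
  racadm -r <idrac-ip> -u root -p <password> hwinventory | grep -i -A4 "pciessd"

If nothing matches after a failing cold boot, the drives were never trained at POST, which points at the bifurcation setting or the carrier card rather than the OS.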

You can send us your service tag if you like: https://www.dell.com/community/en/direct-messaging

 

Respectfully,

 

 

 

1 Rookie • 3 Posts • April 11th, 2025 20:02

@DELL-Young E I am using a 3rd-party PCIe card for my M.2 NVMe drives. It is a Supermicro AOC-SLG3-2M2 PCIe add-on card for up to two NVMe SSDs:

https://www.amazon.com/dp/B071S3ZY8P?ref=ppx_yo2ov_dt_b_fed_asin_title

If this card is causing my issue, can you point me to an inexpensive option that would work correctly? This is a home lab server for running VMware VCF.

When I reset the slot bifurcation, the M.2 NVMe drives are recognized and run flawlessly. When I shut down for a few days and restart, I get the following two errors in the Lifecycle log:

PR8: Device not detected: PCIeSSD(Slot 1-2)
 2025-04-10T22:01:22-0500
Log Sequence Number: 37324
Detailed Description:
The indicated device has been removed from the system, is no longer functioning, or has been disabled.
Recommended Action:
If the part is present, make sure it is properly installed. Ensure that the device has not been disabled in the system BIOS or other configuration utility. Contact technical support if the device is still non functional.
PR8: Device not detected: PCIeSSD(Slot 1-1)
 2025-04-10T22:01:21-0500
Log Sequence Number: 37323
Detailed Description:
The indicated device has been removed from the system, is no longer functioning, or has been disabled.
Recommended Action:
If the part is present, make sure it is properly installed. Ensure that the device has not been disabled in the system BIOS or other configuration utility. Contact technical support if the device is still non functional.
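
For reference, the full Lifecycle log can also be exported for side-by-side comparison between a working and a failing boot. A sketch with remote racadm (placeholder IP and credentials; on some iDRAC8 firmware the export may require a network share rather than a local file):

  # Export the Lifecycle log to a local XML file
  racadm -r <idrac-ip> -u root -p <password> lclog export -f lclog_failing.xml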
Thanks in advance for your time and help!

1 Rookie • 3 Posts • April 21st, 2025 17:24

@DELL-Charles R It is only the M.2 NVMe drives that are lost; everything else functions normally. After the server sits off for a few days, I see the following messages for Slot 1-1 and Slot 1-2 in the system Lifecycle log when rebooting:

PR8: Device not detected: PCIeSSD(Slot 1-1)
 2025-04-12T20:55:04-0500
Log Sequence Number: 37477
Detailed Description:
The indicated device has been removed from the system, is no longer functioning, or has been disabled.
Recommended Action:
If the part is present, make sure it is properly installed. Ensure that the device has not been disabled in the system BIOS or other configuration utility. Contact technical support if the device is still non functional.
Comment: root
I boot back into the BIOS, go to slot bifurcation, and it is still set to 4x4x4. I set it to "default", save it, then boot back into the BIOS and set the bifurcation back to 4x4x4. I save the changes and reboot, and the NVMe drives are recognized and work fine.
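
In principle, the same toggle could be scripted through remote RACADM instead of entering Setup twice. A sketch, assuming remote RACADM is enabled; the IP and credentials are placeholders, and the attribute and value names below are assumptions to confirm against the output of 'racadm get BIOS.IntegratedDevices':

  # Set bifurcation back to default (attribute and value names are assumptions)
  racadm -r <idrac-ip> -u root -p <password> set BIOS.IntegratedDevices.Slot1Bifurcation DefaultBifurcation
  # Queue a BIOS config job and power-cycle to apply it
  racadm -r <idrac-ip> -u root -p <password> jobqueue create BIOS.Setup.1-1 -r pwrcycle -s TIME_NOW

  # After that boot completes, set it back to 4x4x4 and apply again
  racadm -r <idrac-ip> -u root -p <password> set BIOS.IntegratedDevices.Slot1Bifurcation x4x4x4x4Bifurcation
  racadm -r <idrac-ip> -u root -p <password> jobqueue create BIOS.Setup.1-1 -r pwrcycle -s TIME_NOW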

Moderator • 4.6K Posts • April 21st, 2025 18:57

Hello,

Have you tried it in a different boot mode, UEFI or BIOS, to test?
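
For reference, the current mode can be read remotely. A sketch with remote racadm (placeholder IP and credentials):

  # Read the configured boot mode (Bios or Uefi)
  racadm -r <idrac-ip> -u root -p <password> get BIOS.BiosBootSettings.BootMode

Note that actually switching modes would likely leave the existing ESXi install unbootable, so treat it purely as a test.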

 

Is your card up to date on its firmware, if it has any? Check with the manufacturer.

 

Is the R630 up to date on its firmware?

 

PowerEdge: Update Firmware Via HTTPS Using iDRAC 7/8/9
See section "1. How to perform a manual update":
https://www.dell.com/support/kbdoc/en-us/000130533/dell-poweredge-how-to-update-the-firmware-via-https-connection-to-idrac#manual
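
Before updating, a quick way to see what is currently installed; a sketch with remote racadm (placeholder IP and credentials):

  # List BIOS, iDRAC, and Lifecycle Controller versions in one shot
  racadm -r <idrac-ip> -u root -p <password> getversion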

 

Do you have another R630 or any other system you can test your cards in?

 

I don't have much more I can provide. We don't have any M.2 NVMe listed in the R630 parts database.

 

This setup with 3rd-party parts would not be supported by Dell.
