ESXi 5.5 on a raid 5 volume - Disk Failure

tstrip007 Member Posts: 308 ■■■■□□□□□□
I have an ESXi 5.5 install running on a RAID 5 volume. A disk failed in the array and caused ESXi to PSOD. Unfortunately I didn't grab a screenshot of the PSOD, and I'm working on getting the **** file to examine.

I just wanted to ping you guys and see if this is normal behavior. I was able to replace the drive quickly, ESXi came back up, and the array rebuilt just fine, but I didn't know whether a PSOD is normal behavior in the event of a disk failure.

Thanks for any feedback. Perhaps I'm being stupid and overlooking some simple fundamental thing, but I assumed the Hardware Status tab would alert me to the drive failure, I'd replace the disk, it would rebuild, and there would be no interruption to ESXi.

Comments

  • Coolhandluke Member Posts: 118
    Hi,

    I'm no VMware expert by any means, but at work we run around 10 VMware servers (a mix of 5.0 and 5.5). We've had a few disks fail over the years (I can think of two instances on RAID 5 - we're in the process of moving all services to RAID 10 going forward). The alert fires and the server stays stable (although one HP server's fans go crazy, so that in itself is an alert), we replace the disk, it rebuilds, and everything continues with no downtime.

    Are you running a custom VM build? What hardware? I'm guessing DAS? Thin/thick disks?

    Hope this helps.
    [CCENT]->[CCNA]->[CCNP-ROUTE]->[CCNP-SWITCH]->[CCNP-TSHOOT]
  • joelsfood Member Posts: 1,027 ■■■■■■□□□□
    A disk failure in an otherwise healthy server running RAID 5 definitely should not have caused a PSOD. Get those logs and try to figure out what happened. Also look on VMware's site for any firmware/driver updates for your server; you could have hit a bug.
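    As a starting point for pulling the crash data, a rough sketch of what you'd run over SSH on the ESXi 5.x host (assuming SSH/ESXi Shell is enabled; your dump partition layout may differ):

    ```shell
    # Confirm a diagnostic (coredump) partition is configured and active;
    # the PSOD core is written here before the host reboots.
    esxcli system coredump partition get

    # List the partitions that could hold a diagnostic dump.
    esxcli system coredump partition list

    # Generate a full support bundle (includes vmkernel logs and any
    # extracted zdump) that you can hand to VMware support or examine yourself.
    vm-support
    ```

    The vm-support bundle is the easiest way to get everything in one archive; otherwise you'd have to extract the dump from the diagnostic partition by hand.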