Hello,
I have a raidz2-0 array with 5 drives + 1 spare.
The case has 8 HD slots.
After one of the drives failed, I installed a new, larger drive in one of the empty slots and did gpart backup ada0 > ada0.backup and then gpart restore -l ada3 < ada0.backup before removing the failing drive. I have also added bootcode to the new drive. I have 2 pools on this array and both of them were restored after zpool replace. I then replaced another one of the old drives with the same procedure, and both pools are fine.
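For reference, the whole sequence looked roughly like this (the exact bootcode command and the zpool replace arguments are reconstructed from memory; the second pool name and the old/new partition names are placeholders):
Code:
# copy the partition table, including the GPT labels, from a healthy member to the new disk
gpart backup ada0 > ada0.backup
gpart restore -l ada3 < ada0.backup
# install boot code on the new disk (assuming GPT with ZFS boot, freebsd-boot at index 1)
gpart bootcode -b /boot/pmbr -p /boot/gptzfsboot -i 1 ada3
# resilver each pool onto the new partitions
zpool replace zroot <old-partition> <new-partition>
zpool replace <datapool> <old-partition> <new-partition>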
However, only now did I notice that gpart restore of course restored the labels as well, and now the labels on the 6 disks look like this:
Code:
# gpart list | grep label
label: (null)
label: disk0
label: data0
label: (null)
label: disk1
label: data1
label: (null)
label: disk0
label: data0
label: (null)
label: disk3
label: data3
label: (null)
label: disk4
label: data4
label: (null)
label: disk0
label: data0
Although the computer boots from the zroot pool and everything else seems to be working fine, I'm wondering what kind of trouble I'm getting into by having the same labels on different disks?
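If the duplicates are likely to cause problems, I assume I could simply rename them on the newer disks with something like this (untested; partition indexes 2 and 3 are a guess based on my layout, I would check with gpart show -l first):
Code:
# check which partition indexes carry the labels on the new disk
gpart show -l ada3
# rename the duplicated labels to numbers that are not in use yet
gpart modify -i 2 -l disk5 ada3
gpart modify -i 3 -l data5 ada3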