Friday, June 3, 2016

What disk was ESXi installed to?

I was talking with a colleague about how to find out which device ESXi was installed to. This matters especially when:

1) you don't have documentation, are adding a new host, and want to be consistent
2) you are doing a hardware change and are not sure whether replacing an SD card or disks would also remove ESXi

Answering this requires understanding how ESXi partitions whatever disk device you gave it at installation time. ESXi creates several partitions and, if there is free space left over, creates a VMFS partition on the remaining disk space and calls it datastoreX. From the vCenter/ESXi console, the administrator mostly sees datastores, not realizing they are partitions.

Datastores are partitions of type VMFS, while the ESXi installation files are stored on partitions of type vfat. When we ask ESXi where it is installed, the answer is a partition; however, we are probably more interested in the disk itself. Typically, this means we need to translate the partition to the naa ID of the underlying disk.

The fastest way I've found:

You need three commands for just the device info, or four if you also want to know which datastore was created on that same disk device. Each command's output feeds the next (see the combined sketch after step 4):

1 ~ # esxcfg-info -b

dea75b72-115e5271-3e7b-9b3ef7301455

2 ~ # esxcfg-scsidevs -f | grep dea75b72-115e5271-3e7b-9b3ef7301455

naa.6b8ca3a0e704b3001bb807e52a66aad5:5                           /vmfs/devices/disks/naa.6b8ca3a0e704b3001bb807e52a66aad5:5 dea75b72-115e5271-3e7b-9b3ef7301455

3 ~ # esxcli storage core device list -d naa.6b8ca3a0e704b3001bb807e52a66aad5

naa.6b8ca3a0e704b3001bb807e52a66aad5
   Display Name: Local DELL Disk (naa.6b8ca3a0e704b3001bb807e52a66aad5)
   Has Settable Display Name: true
   Size: 6673408
   Device Type: Direct-Access
   Multipath Plugin: NMP
   Devfs Path: /vmfs/devices/disks/naa.6b8ca3a0e704b3001bb807e52a66aad5
   Vendor: DELL
   Model: PERC H710
   Revision: 3.13
   SCSI Level: 5
   Is Pseudo: false
   Status: on
   Is RDM Capable: false
   Is Local: true
   Is Removable: false
   Is SSD: false
   Is Offline: false
   Is Perennially Reserved: false
   Queue Full Sample Size: 0
   Queue Full Threshold: 0
   Thin Provisioning Status: unknown
   Attached Filters:
   VAAI Status: unsupported
   Other UIDs: vml.02000000006b8ca3a0e704b3001bb807e52a66aad5504552432048
   Is Local SAS Device: false
   Is USB: false
   Is Boot USB Device: false
   No of outstanding IOs with competing worlds: 32

Now you know the Display Name, Size (in MB; 6673408 MB ÷ 1024² ≈ 6.36 TB for the device above), Vendor, and Model of the device where ESXi was installed (in the case of a RAID virtual disk, you get the controller details).

However, you may still not be sure what that device is. Since we tend to think in terms of datastores, we can issue one more command to find out which VMFS datastore sits on that same disk:

4 ~ # esxcli storage vmfs extent list | grep naa.6b8ca3a0e704b3001bb807e52a66aad5

RAID6-9x1TB-SAS     54255423-f706a47a-6e39-90b11c4fcfc5              0  naa.6b8ca3a0e704b3001bb807e52a66aad5          3

And there you have it. You probably remember which virtual disk ties to a datastore name.
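
If you do this often, the four steps chain naturally. Here is a minimal sketch (untested, and it assumes esxcfg-scsidevs -f keeps the colon-delimited naa:partition format shown above):

bootfs=$(esxcfg-info -b)                                        # UUID of the bootbank volume
part=$(esxcfg-scsidevs -f | grep "$bootfs" | awk '{print $1}')  # naa.XXXX:N partition spec
disk=${part%:*}                                                 # strip the partition number
esxcli storage core device list -d "$disk"                      # device details
esxcli storage vmfs extent list | grep "$disk"                  # datastore(s) on the same disk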

Sadly, I haven't yet found a way to assure you that you have picked the right device if, for example, you have two identical virtual drives (one crude cross-check is comparing sizes). I wish there were a command in racadm that showed the naa (SasAddress) of a virtual disk, but I have not found one. The closest I found was

racadm raid get vdisks -o

and it doesn't show the SasAddress, although it's clearly shown in the ESXi installation steps (this is from another server, so the naa IDs don't match, but you see my point):

[screenshot: ESXi installer disk-selection screen showing the naa ID of the virtual disk]

By the way, getting the naa identifier of a physical disk from racadm is easy:

racadm raid get pdisks -o -p name,SasAddress,MediaType

and very useful for VSAN; this post by @sebastiangrugel shows it well.

Thanks to @sjin2008, who made an excellent post on esxcli commands that showed up in Google when I needed a table relating mount points and UUIDs to volume names.


Other ways

Googling around I found two related KBs:

KB2014558 tells us the type of installation, but not where it was installed. Still, it's a good idea to check the installation type, in case the host was booted off PXE.
KB2030957 is more useful: it tells you the device, but if you are trying to figure out which datastore that device relates to, you still need the fourth command above; see the sketch just below.
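
If I remember that KB correctly, its trick boils down to asking vmkfstools about the bootbank volume. A minimal sketch, assuming vmkfstools -P accepts the /bootbank symlink the same way it accepts a /vmfs/volumes path (I haven't verified this on every build):

~ # vmkfstools -P /bootbank

The 'Partitions spanned' line in its output names the naa device and partition directly, skipping the UUID grep.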

Here are a few commands related to this.

A good command to show the bootbank, altbootbank, and scratch partitions:

~ # ls -l /
total 525
lrwxrwxrwx    1 root     root            49 Sep 30  2014 altbootbank -> /vmfs/volumes/6308247d-8d9ec099-ae26-5ec2d9ba9e00
drwxr-xr-x    1 root     root           512 Sep 30  2014 bin
lrwxrwxrwx    1 root     root            49 Sep 30  2014 bootbank -> /vmfs/volumes/dea75b72-115e5271-3e7b-9b3ef7301455
-r--r--r--    1 root     root        300059 Aug 23  2014 bootpart.gz
drwxr-xr-x    1 root     root           512 Jun  3 09:29 dev
drwxr-xr-x    1 root     root           512 Jun  3 08:35 etc
drwxr-xr-x    1 root     root           512 Sep 30  2014 lib
drwxr-xr-x    1 root     root           512 Sep 30  2014 lib64
-r-x------    1 root     root         14040 Sep 29  2014 local.tgz
lrwxrwxrwx    1 root     root             6 Sep 30  2014 locker -> /store
drwxr-xr-x    1 root     root           512 Sep 30  2014 mbr
drwxr-xr-x    1 root     root           512 Sep 30  2014 opt
drwxr-xr-x    1 root     root        131072 Jun  3 09:29 proc
lrwxrwxrwx    1 root     root            22 Sep 30  2014 productLocker -> /locker/packages/5.5.0
lrwxrwxrwx    1 root     root             4 Aug 23  2014 sbin -> /bin
lrwxrwxrwx    1 root     root            49 Sep 30  2014 scratch -> /vmfs/volumes/54255423-287d3e20-5dd5-90b11c4fcfc5
lrwxrwxrwx    1 root     root            49 Sep 30  2014 store -> /vmfs/volumes/5425541a-bbf00056-e4c2-90b11c4fcfc5
drwxr-xr-x    1 root     root           512 Sep 30  2014 tardisks
drwxr-xr-x    1 root     root           512 Sep 30  2014 tardisks.noauto
drwxrwxrwt    1 root     root           512 Jun  3 09:01 tmp
drwxr-xr-x    1 root     root           512 Sep 30  2014 usr
drwxr-xr-x    1 root     root           512 Sep 30  2014 var
drwxr-xr-x    1 root     root           512 Sep 30  2014 vmfs
drwxr-xr-x    1 root     root           512 Sep 30  2014 vmimages
lrwxrwxrwx    1 root     root            17 Aug 23  2014 vmupgrade -> /locker/vmupgrade
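
Notice that bootbank points at the same UUID esxcfg-info -b returned earlier. A small loop can map each boot-related symlink straight to its backing device; this is a sketch, assuming busybox readlink and awk are available on your build:

for link in /bootbank /altbootbank /scratch; do
  uuid=$(readlink "$link" | awk -F/ '{print $NF}')      # volume UUID from the symlink target
  printf '%s -> ' "$link"
  esxcfg-scsidevs -f | grep "$uuid" | awk '{print $1}'  # naa.XXXX:N backing partition
done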

The commands below are also a good way of seeing the partitions, their different formats, and the disk devices.

~ # df -h
Filesystem   Size   Used Available Use% Mounted on
VMFS-5     557.5G 212.5G    345.0G  38% /vmfs/volumes/RAID10-4x300GB-SSD
VMFS-5       6.4T   1.2T      5.2T  18% /vmfs/volumes/RAID6-9x1TB-SAS
VMFS-5       1.1T 392.5G    722.7G  35% /vmfs/volumes/RAID10-8x300GB-SSD
vfat         4.0G  28.7M      4.0G   1% /vmfs/volumes/54255423-287d3e20-5dd5-90b11c4fcfc5
vfat       249.7M 157.3M     92.4M  63% /vmfs/volumes/dea75b72-115e5271-3e7b-9b3ef7301455
vfat       249.7M   8.0K    249.7M   0% /vmfs/volumes/6308247d-8d9ec099-ae26-5ec2d9ba9e00
vfat       285.8M 193.4M     92.4M  68% /vmfs/volumes/5425541a-bbf00056-e4c2-90b11c4fcfc5

~ # esxcli storage filesystem list
Mount Point                                        Volume Name         UUID                                 Mounted  Type             Size           Free
-------------------------------------------------  ------------------  -----------------------------------  -------  ------  -------------  -------------
/vmfs/volumes/54290d18-1f6d577a-44e2-90b11c4fcfc5  RAID10-4x300GB-SSD  54290d18-1f6d577a-44e2-90b11c4fcfc5     true  VMFS-5   598611066880   370453512192
/vmfs/volumes/54255423-f706a47a-6e39-90b11c4fcfc5  RAID6-9x1TB-SAS     54255423-f706a47a-6e39-90b11c4fcfc5     true  VMFS-5  6989522403328  5715596935168
/vmfs/volumes/54290d55-396ddecc-89f2-90b11c4fcfc5  RAID10-8x300GB-SSD  54290d55-396ddecc-89f2-90b11c4fcfc5     true  VMFS-5  1197490569216   776033271808
/vmfs/volumes/54255423-287d3e20-5dd5-90b11c4fcfc5                      54255423-287d3e20-5dd5-90b11c4fcfc5     true  vfat       4293591040     4258988032
/vmfs/volumes/dea75b72-115e5271-3e7b-9b3ef7301455                      dea75b72-115e5271-3e7b-9b3ef7301455     true  vfat        261853184       96870400
/vmfs/volumes/6308247d-8d9ec099-ae26-5ec2d9ba9e00                      6308247d-8d9ec099-ae26-5ec2d9ba9e00     true  vfat        261853184      261844992
/vmfs/volumes/5425541a-bbf00056-e4c2-90b11c4fcfc5                      5425541a-bbf00056-e4c2-90b11c4fcfc5     true  vfat        299712512       96935936

~ # esxcli storage core device partition list
Device                                Partition  Start Sector   End Sector  Type           Size
------------------------------------  ---------  ------------  -----------  ----  -------------
naa.6b8ca3a0e704b3001bb807e52a66aad5          0             0  13667139584     0  6997575467008
naa.6b8ca3a0e704b3001bb807e52a66aad5          1            64         8192     0        4161536
naa.6b8ca3a0e704b3001bb807e52a66aad5          2       7086080     15472640     6     4293918720
naa.6b8ca3a0e704b3001bb807e52a66aad5          3      15472640  13667139551    fb  6989653458432
naa.6b8ca3a0e704b3001bb807e52a66aad5          5          8224       520192     6      262127616
naa.6b8ca3a0e704b3001bb807e52a66aad5          6        520224      1032192     6      262127616
naa.6b8ca3a0e704b3001bb807e52a66aad5          7       1032224      1257472    fc      115326976
naa.6b8ca3a0e704b3001bb807e52a66aad5          8       1257504      1843200     6      299876352
naa.6b8ca3a0e704b3001bb807e52a66aad5          9       1843200      7086080    fc     2684354560
naa.6b8ca3a0e704b3001bb807902555b244          0             0   2339373056     0  1197759004672
naa.6b8ca3a0e704b3001bb807902555b244          1          2048   2339373023    fb  1197757939200
naa.6b8ca3a0e704b3001bb806fa1c66c62d          0             0   1169686528     0   598879502336
naa.6b8ca3a0e704b3001bb806fa1c66c62d          1          2048   1169686495    fb   598878436864

In the partition list above, type 6 is FAT16 (the vfat bootbanks and store), fb is VMFS, and fc is vmkDiagnostic (coredump). The partition GUIDs tell the same story: aa31e02a… is the VMFS GUID, ebd0a0a2… is basic data (the vfat partitions), 9d275380… is vmkDiagnostic, and c12a7328… is the EFI system partition.

~ # esxcli storage core device partition showguid
Device                                Partition  Layout  GUID
------------------------------------  ---------  ------  --------------------------------
naa.6b8ca3a0e704b3001bb807e52a66aad5          0  GPT     00000000000000000000000000000000
naa.6b8ca3a0e704b3001bb807e52a66aad5          1  GPT     c12a7328f81f11d2ba4b00a0c93ec93b
naa.6b8ca3a0e704b3001bb807e52a66aad5          2  GPT     ebd0a0a2b9e5443387c068b6b72699c7
naa.6b8ca3a0e704b3001bb807e52a66aad5          3  GPT     aa31e02a400f11db9590000c2911d1b8
naa.6b8ca3a0e704b3001bb807e52a66aad5          5  GPT     ebd0a0a2b9e5443387c068b6b72699c7
naa.6b8ca3a0e704b3001bb807e52a66aad5          6  GPT     ebd0a0a2b9e5443387c068b6b72699c7
naa.6b8ca3a0e704b3001bb807e52a66aad5          7  GPT     9d27538040ad11dbbf97000c2911d1b8
naa.6b8ca3a0e704b3001bb807e52a66aad5          8  GPT     ebd0a0a2b9e5443387c068b6b72699c7
naa.6b8ca3a0e704b3001bb807e52a66aad5          9  GPT     9d27538040ad11dbbf97000c2911d1b8
naa.6b8ca3a0e704b3001bb807902555b244          0  GPT     00000000000000000000000000000000
naa.6b8ca3a0e704b3001bb807902555b244          1  GPT     aa31e02a400f11db9590000c2911d1b8
naa.6b8ca3a0e704b3001bb806fa1c66c62d          0  GPT     00000000000000000000000000000000
naa.6b8ca3a0e704b3001bb806fa1c66c62d          1  GPT     aa31e02a400f11db9590000c2911d1b8

The visorfs ramdisk list is a reminder that / and /etc live in memory, not on the install device:

~ # esxcli system visorfs ramdisk list
Ramdisk Name  System  Include in Coredumps   Reserved      Maximum       Used  Peak Used   Free  Reserved Free  Maximum Inodes  Allocated Inodes  Used Inodes  Mount Point
------------  ------  --------------------  ---------  -----------  ---------  ---------  -----  -------------  --------------  ----------------  -----------  ---------------------------
root            true                  true  32768 KiB    32768 KiB    532 KiB    544 KiB   98 %           98 %            8192              4096         3654  /
etc             true                  true  28672 KiB    28672 KiB    184 KiB    216 KiB   99 %           99 %            4096              1024          463  /etc
tmp            false                 false   2048 KiB   196608 KiB      4 KiB    228 KiB   99 %           99 %            8192               256            3  /tmp
hostdstats     false                 false      0 KiB  1078272 KiB  12704 KiB  12892 KiB   98 %            0 %            8192                32            5  /var/lib/vmware/hostd/stats
snmptraps      false                 false      0 KiB     1024 KiB      0 KiB      0 KiB  100 %            0 %            8192                32            1  /var/spool/snmp

Finally, listing /vmfs/devices/disks shows every device and its partitions, with the vml.* identifiers symlinked back to the naa names:

~ # ls -alh /vmfs/devices/disks
total 17176196976
drwxr-xr-x    1 root     root         512 Jun  3 09:25 .
drwxr-xr-x    1 root     root         512 Jun  3 09:25 ..
-rw-------    1 root     root           0 Jun  3 09:25 mpx.vmhba32:C0:T0:L1
-rw-------    1 root     root      557.8G Jun  3 09:25 naa.6b8ca3a0e704b3001bb806fa1c66c62d
-rw-------    1 root     root      557.7G Jun  3 09:25 naa.6b8ca3a0e704b3001bb806fa1c66c62d:1
-rw-------    1 root     root        1.1T Jun  3 09:25 naa.6b8ca3a0e704b3001bb807902555b244
-rw-------    1 root     root        1.1T Jun  3 09:25 naa.6b8ca3a0e704b3001bb807902555b244:1
-rw-------    1 root     root        6.4T Jun  3 09:25 naa.6b8ca3a0e704b3001bb807e52a66aad5
-rw-------    1 root     root        4.0M Jun  3 09:25 naa.6b8ca3a0e704b3001bb807e52a66aad5:1
-rw-------    1 root     root        4.0G Jun  3 09:25 naa.6b8ca3a0e704b3001bb807e52a66aad5:2
-rw-------    1 root     root        6.4T Jun  3 09:25 naa.6b8ca3a0e704b3001bb807e52a66aad5:3
-rw-------    1 root     root      250.0M Jun  3 09:25 naa.6b8ca3a0e704b3001bb807e52a66aad5:5
-rw-------    1 root     root      250.0M Jun  3 09:25 naa.6b8ca3a0e704b3001bb807e52a66aad5:6
-rw-------    1 root     root      110.0M Jun  3 09:25 naa.6b8ca3a0e704b3001bb807e52a66aad5:7
-rw-------    1 root     root      286.0M Jun  3 09:25 naa.6b8ca3a0e704b3001bb807e52a66aad5:8
-rw-------    1 root     root        2.5G Jun  3 09:25 naa.6b8ca3a0e704b3001bb807e52a66aad5:9
lrwxrwxrwx    1 root     root          20 Jun  3 09:25 vml.0000010000766d68626133323a303a31 -> mpx.vmhba32:C0:T0:L1
lrwxrwxrwx    1 root     root          36 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb806fa1c66c62d504552432048 -> naa.6b8ca3a0e704b3001bb806fa1c66c62d
lrwxrwxrwx    1 root     root          38 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb806fa1c66c62d504552432048:1 -> naa.6b8ca3a0e704b3001bb806fa1c66c62d:1
lrwxrwxrwx    1 root     root          36 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb807902555b244504552432048 -> naa.6b8ca3a0e704b3001bb807902555b244
lrwxrwxrwx    1 root     root          38 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb807902555b244504552432048:1 -> naa.6b8ca3a0e704b3001bb807902555b244:1
lrwxrwxrwx    1 root     root          36 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb807e52a66aad5504552432048 -> naa.6b8ca3a0e704b3001bb807e52a66aad5
lrwxrwxrwx    1 root     root          38 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb807e52a66aad5504552432048:1 -> naa.6b8ca3a0e704b3001bb807e52a66aad5:1
lrwxrwxrwx    1 root     root          38 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb807e52a66aad5504552432048:2 -> naa.6b8ca3a0e704b3001bb807e52a66aad5:2
lrwxrwxrwx    1 root     root          38 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb807e52a66aad5504552432048:3 -> naa.6b8ca3a0e704b3001bb807e52a66aad5:3
lrwxrwxrwx    1 root     root          38 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb807e52a66aad5504552432048:5 -> naa.6b8ca3a0e704b3001bb807e52a66aad5:5
lrwxrwxrwx    1 root     root          38 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb807e52a66aad5504552432048:6 -> naa.6b8ca3a0e704b3001bb807e52a66aad5:6
lrwxrwxrwx    1 root     root          38 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb807e52a66aad5504552432048:7 -> naa.6b8ca3a0e704b3001bb807e52a66aad5:7
lrwxrwxrwx    1 root     root          38 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb807e52a66aad5504552432048:8 -> naa.6b8ca3a0e704b3001bb807e52a66aad5:8
lrwxrwxrwx    1 root     root          38 Jun  3 09:25 vml.02000000006b8ca3a0e704b3001bb807e52a66aad5504552432048:9 -> naa.6b8ca3a0e704b3001bb807e52a66aad5:9
